Compare commits: `refactor-s...main` (313 commits)
.gitignore (vendored, +1)

```diff
@@ -61,3 +61,4 @@ tmp/
 #.claude Konfiguration
 .claude/
+.claude/settings.local.json
```

frontend/package-lock.json (diff collapsed)
backend/apply_v9c_migration.py (new file, 253 lines)

```python
#!/usr/bin/env python3
"""
Apply v9c Subscription System Migration

This script checks if v9c migration is needed and applies it.
Run automatically on container startup via main.py startup event.
"""
import os
import psycopg2
from psycopg2.extras import RealDictCursor


def get_db_connection():
    """Get PostgreSQL connection."""
    return psycopg2.connect(
        host=os.getenv("DB_HOST", "postgres"),
        port=int(os.getenv("DB_PORT", 5432)),
        database=os.getenv("DB_NAME", "mitai_prod"),
        user=os.getenv("DB_USER", "mitai_prod"),
        password=os.getenv("DB_PASSWORD", ""),
        cursor_factory=RealDictCursor
    )


def migration_needed(conn):
    """Check if v9c migration is needed."""
    cur = conn.cursor()

    # Check if tiers table exists
    cur.execute("""
        SELECT EXISTS (
            SELECT FROM information_schema.tables
            WHERE table_name = 'tiers'
        )
    """)
    tiers_exists = cur.fetchone()['exists']

    # Check if features table exists
    cur.execute("""
        SELECT EXISTS (
            SELECT FROM information_schema.tables
            WHERE table_name = 'features'
        )
    """)
    features_exists = cur.fetchone()['exists']

    cur.close()

    # Migration needed if either table is missing
    return not (tiers_exists and features_exists)


def apply_migration():
    """Apply v9c migration if needed."""
    print("[v9c Migration] Checking if migration is needed...")

    try:
        conn = get_db_connection()

        if not migration_needed(conn):
            print("[v9c Migration] Already applied, skipping.")
            conn.close()

            # Even if main migration is done, check cleanup
            apply_cleanup_migration()
            return

        print("[v9c Migration] Applying subscription system migration...")

        # Read migration SQL
        migration_path = os.path.join(
            os.path.dirname(__file__),
            "migrations",
            "v9c_subscription_system.sql"
        )

        with open(migration_path, 'r', encoding='utf-8') as f:
            migration_sql = f.read()

        # Execute migration
        cur = conn.cursor()
        cur.execute(migration_sql)
        conn.commit()
        cur.close()
        conn.close()

        print("[v9c Migration] ✅ Migration completed successfully!")

        # Apply fix migration if exists
        fix_migration_path = os.path.join(
            os.path.dirname(__file__),
            "migrations",
            "v9c_fix_features.sql"
        )

        if os.path.exists(fix_migration_path):
            print("[v9c Migration] Applying feature fixes...")
            with open(fix_migration_path, 'r', encoding='utf-8') as f:
                fix_sql = f.read()

            conn = get_db_connection()
            cur = conn.cursor()
            cur.execute(fix_sql)
            conn.commit()
            cur.close()
            conn.close()
            print("[v9c Migration] ✅ Feature fixes applied!")

        # Verify tables created
        conn = get_db_connection()
        cur = conn.cursor()
        cur.execute("""
            SELECT table_name FROM information_schema.tables
            WHERE table_schema = 'public'
              AND table_name IN ('tiers', 'features', 'tier_limits', 'access_grants', 'coupons')
            ORDER BY table_name
        """)
        tables = [r['table_name'] for r in cur.fetchall()]
        print(f"[v9c Migration] Created tables: {', '.join(tables)}")

        # Verify initial data
        cur.execute("SELECT COUNT(*) as count FROM tiers")
        tier_count = cur.fetchone()['count']
        cur.execute("SELECT COUNT(*) as count FROM features")
        feature_count = cur.fetchone()['count']
        cur.execute("SELECT COUNT(*) as count FROM tier_limits")
        limit_count = cur.fetchone()['count']

        print(f"[v9c Migration] Initial data: {tier_count} tiers, {feature_count} features, {limit_count} tier limits")

        cur.close()
        conn.close()

        # After successful migration, apply cleanup
        apply_cleanup_migration()

    except Exception as e:
        print(f"[v9c Migration] ❌ Error: {e}")
        raise


def cleanup_features_needed(conn):
    """Check if feature cleanup migration is needed."""
    cur = conn.cursor()

    # Check if old export features still exist
    cur.execute("""
        SELECT COUNT(*) as count FROM features
        WHERE id IN ('export_csv', 'export_json', 'export_zip')
    """)
    old_exports = cur.fetchone()['count']

    # Check if csv_import needs to be renamed
    cur.execute("""
        SELECT COUNT(*) as count FROM features
        WHERE id = 'csv_import'
    """)
    old_import = cur.fetchone()['count']

    cur.close()

    # Cleanup needed if old features exist
    return old_exports > 0 or old_import > 0


def apply_cleanup_migration():
    """Apply v9c feature cleanup migration."""
    print("[v9c Cleanup] Checking if cleanup migration is needed...")

    try:
        conn = get_db_connection()

        if not cleanup_features_needed(conn):
            print("[v9c Cleanup] Already applied, skipping.")
            conn.close()
            return

        print("[v9c Cleanup] Applying feature consolidation...")

        # Show BEFORE state
        cur = conn.cursor()
        cur.execute("SELECT id, name FROM features ORDER BY category, id")
        features_before = [f"{r['id']} ({r['name']})" for r in cur.fetchall()]
        print(f"[v9c Cleanup] Features BEFORE: {len(features_before)} features")
        for f in features_before:
            print(f"  - {f}")
        cur.close()

        # Read cleanup migration SQL
        cleanup_path = os.path.join(
            os.path.dirname(__file__),
            "migrations",
            "v9c_cleanup_features.sql"
        )

        if not os.path.exists(cleanup_path):
            print(f"[v9c Cleanup] ⚠️ Cleanup migration file not found: {cleanup_path}")
            conn.close()
            return

        with open(cleanup_path, 'r', encoding='utf-8') as f:
            cleanup_sql = f.read()

        # Execute cleanup migration
        cur = conn.cursor()
        cur.execute(cleanup_sql)
        conn.commit()
        cur.close()

        # Show AFTER state
        cur = conn.cursor()
        cur.execute("SELECT id, name, category FROM features ORDER BY category, id")
        features_after = cur.fetchall()
        print(f"[v9c Cleanup] Features AFTER: {len(features_after)} features")

        # Group by category
        categories = {}
        for f in features_after:
            cat = f['category'] or 'other'
            if cat not in categories:
                categories[cat] = []
            categories[cat].append(f"{f['id']} ({f['name']})")

        for cat, feats in sorted(categories.items()):
            print(f"  {cat.upper()}:")
            for f in feats:
                print(f"    - {f}")

        # Verify tier_limits updated
        cur.execute("""
            SELECT tier_id, feature_id, limit_value
            FROM tier_limits
            WHERE feature_id IN ('data_export', 'data_import')
            ORDER BY tier_id, feature_id
        """)
        limits = cur.fetchall()
        print("[v9c Cleanup] Tier limits for data_export/data_import:")
        for lim in limits:
            limit_str = 'unlimited' if lim['limit_value'] is None else lim['limit_value']
            print(f"  {lim['tier_id']}.{lim['feature_id']} = {limit_str}")

        cur.close()
        conn.close()

        print("[v9c Cleanup] ✅ Feature cleanup completed successfully!")

    except Exception as e:
        print(f"[v9c Cleanup] ❌ Error: {e}")
        raise


if __name__ == "__main__":
    apply_migration()
```
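The check-then-apply guard in `migration_needed` makes the startup migration idempotent: work is only done when the `tiers` or `features` table is missing. The same pattern can be exercised end-to-end against an in-memory SQLite database; this is a simplified sketch, not the PostgreSQL code above (`sqlite_master` stands in for `information_schema.tables`, and the table definitions are illustrative):

```python
import sqlite3


def table_exists(conn, name):
    # SQLite equivalent of the information_schema existence check
    cur = conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table' AND name=?", (name,))
    return cur.fetchone() is not None


def migration_needed(conn):
    # Migration needed if either table is missing
    return not (table_exists(conn, 'tiers') and table_exists(conn, 'features'))


conn = sqlite3.connect(':memory:')
print(migration_needed(conn))   # True: neither table exists yet
conn.execute("CREATE TABLE tiers (id TEXT PRIMARY KEY)")
conn.execute("CREATE TABLE features (id TEXT PRIMARY KEY)")
print(migration_needed(conn))   # False: both tables present, so the guard skips
```

Running the check twice with no schema change returns the same answer, which is what lets the real script run unconditionally on every container start.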
backend/auth.py (+258)

```diff
@@ -7,6 +7,7 @@ for FastAPI endpoints.
 import hashlib
 import secrets
 from typing import Optional
+from datetime import datetime, timedelta
 from fastapi import Header, Query, HTTPException
 import bcrypt
```

The second hunk appends feature-access helpers after `require_admin`:

```diff
@@ -114,3 +115,260 @@ def require_admin(x_auth_token: Optional[str] = Header(default=None)):
     if session['role'] != 'admin':
         raise HTTPException(403, "Nur für Admins")
     return session
```

```python
# ============================================================================
# Feature Access Control (v9c)
# ============================================================================

def get_effective_tier(profile_id: str, conn=None) -> str:
    """
    Get the effective tier for a profile.

    Checks for active access_grants first (from coupons, trials, etc.),
    then falls back to profile.tier.

    Args:
        profile_id: User profile ID
        conn: Optional existing DB connection (to avoid pool exhaustion)

    Returns:
        tier_id (str): 'free', 'basic', 'premium', or 'selfhosted'
    """
    # Use existing connection if provided, otherwise open new one
    if conn:
        cur = get_cursor(conn)

        # Check for active access grants (highest priority)
        cur.execute("""
            SELECT tier_id
            FROM access_grants
            WHERE profile_id = %s
              AND is_active = true
              AND valid_from <= CURRENT_TIMESTAMP
              AND valid_until > CURRENT_TIMESTAMP
            ORDER BY valid_until DESC
            LIMIT 1
        """, (profile_id,))

        grant = cur.fetchone()
        if grant:
            return grant['tier_id']

        # Fall back to profile tier
        cur.execute("SELECT tier FROM profiles WHERE id = %s", (profile_id,))
        profile = cur.fetchone()
        return profile['tier'] if profile else 'free'
    else:
        # Open new connection if none provided
        with get_db() as conn:
            return get_effective_tier(profile_id, conn)


def check_feature_access(profile_id: str, feature_id: str, conn=None) -> dict:
    """
    Check if a profile has access to a feature.

    Access hierarchy:
    1. User-specific restriction (user_feature_restrictions)
    2. Tier limit (tier_limits)
    3. Feature default (features.default_limit)

    Args:
        profile_id: User profile ID
        feature_id: Feature ID to check
        conn: Optional existing DB connection (to avoid pool exhaustion)

    Returns:
        dict: {
            'allowed': bool,
            'limit': int | None,      # NULL = unlimited
            'used': int,
            'remaining': int | None,  # NULL = unlimited
            'reason': str  # 'unlimited', 'within_limit', 'limit_exceeded', 'feature_disabled'
        }
    """
    # Use existing connection if provided
    if conn:
        return _check_impl(profile_id, feature_id, conn)
    else:
        with get_db() as conn:
            return _check_impl(profile_id, feature_id, conn)


def _check_impl(profile_id: str, feature_id: str, conn) -> dict:
    """Internal implementation of check_feature_access."""
    cur = get_cursor(conn)

    # Get feature info
    cur.execute("""
        SELECT limit_type, reset_period, default_limit
        FROM features
        WHERE id = %s AND active = true
    """, (feature_id,))
    feature = cur.fetchone()

    if not feature:
        return {
            'allowed': False,
            'limit': None,
            'used': 0,
            'remaining': None,
            'reason': 'feature_not_found'
        }

    # Priority 1: Check user-specific restriction
    cur.execute("""
        SELECT limit_value
        FROM user_feature_restrictions
        WHERE profile_id = %s AND feature_id = %s
    """, (profile_id, feature_id))
    restriction = cur.fetchone()

    if restriction is not None:
        limit = restriction['limit_value']
    else:
        # Priority 2: Check tier limit
        tier_id = get_effective_tier(profile_id, conn)
        cur.execute("""
            SELECT limit_value
            FROM tier_limits
            WHERE tier_id = %s AND feature_id = %s
        """, (tier_id, feature_id))
        tier_limit = cur.fetchone()

        if tier_limit is not None:
            limit = tier_limit['limit_value']
        else:
            # Priority 3: Feature default
            limit = feature['default_limit']

    # For boolean features (limit 0 = disabled, 1 = enabled)
    if feature['limit_type'] == 'boolean':
        allowed = limit == 1
        return {
            'allowed': allowed,
            'limit': limit,
            'used': 0,
            'remaining': None,
            'reason': 'enabled' if allowed else 'feature_disabled'
        }

    # For count-based features:
    # Check current usage
    cur.execute("""
        SELECT usage_count, reset_at
        FROM user_feature_usage
        WHERE profile_id = %s AND feature_id = %s
    """, (profile_id, feature_id))
    usage = cur.fetchone()

    used = usage['usage_count'] if usage else 0

    # Check if reset is needed
    if usage and usage['reset_at'] and datetime.now() > usage['reset_at']:
        # Reset usage
        used = 0
        next_reset = _calculate_next_reset(feature['reset_period'])
        cur.execute("""
            UPDATE user_feature_usage
            SET usage_count = 0, reset_at = %s, updated = CURRENT_TIMESTAMP
            WHERE profile_id = %s AND feature_id = %s
        """, (next_reset, profile_id, feature_id))
        conn.commit()

    # NULL limit = unlimited
    if limit is None:
        return {
            'allowed': True,
            'limit': None,
            'used': used,
            'remaining': None,
            'reason': 'unlimited'
        }

    # 0 limit = disabled
    if limit == 0:
        return {
            'allowed': False,
            'limit': 0,
            'used': used,
            'remaining': 0,
            'reason': 'feature_disabled'
        }

    # Check if within limit
    allowed = used < limit
    remaining = limit - used if limit else None

    return {
        'allowed': allowed,
        'limit': limit,
        'used': used,
        'remaining': remaining,
        'reason': 'within_limit' if allowed else 'limit_exceeded'
    }


def increment_feature_usage(profile_id: str, feature_id: str) -> None:
    """
    Increment usage counter for a feature.

    Creates usage record if it doesn't exist, with reset_at based on
    feature's reset_period.
    """
    with get_db() as conn:
        cur = get_cursor(conn)

        # Get feature reset period
        cur.execute("""
            SELECT reset_period
            FROM features
            WHERE id = %s
        """, (feature_id,))
        feature = cur.fetchone()

        if not feature:
            return

        reset_period = feature['reset_period']
        next_reset = _calculate_next_reset(reset_period)

        # Upsert usage
        cur.execute("""
            INSERT INTO user_feature_usage (profile_id, feature_id, usage_count, reset_at)
            VALUES (%s, %s, 1, %s)
            ON CONFLICT (profile_id, feature_id)
            DO UPDATE SET
                usage_count = user_feature_usage.usage_count + 1,
                updated = CURRENT_TIMESTAMP
        """, (profile_id, feature_id, next_reset))

        conn.commit()


def _calculate_next_reset(reset_period: str) -> Optional[datetime]:
    """
    Calculate next reset timestamp based on reset period.

    Args:
        reset_period: 'never', 'daily', 'monthly'

    Returns:
        datetime or None (for 'never')
    """
    if reset_period == 'never':
        return None
    elif reset_period == 'daily':
        # Reset at midnight
        tomorrow = datetime.now().date() + timedelta(days=1)
        return datetime.combine(tomorrow, datetime.min.time())
    elif reset_period == 'monthly':
        # Reset at start of next month
        now = datetime.now()
        if now.month == 12:
            return datetime(now.year + 1, 1, 1)
        else:
            return datetime(now.year, now.month + 1, 1)
    else:
        return None
```
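The resolution order and the limit semantics in `_check_impl` (None = unlimited, 0 = disabled, otherwise a count cap, with row *presence* rather than value deciding which level applies) can be isolated as pure functions. This is an illustrative sketch, not part of the actual module:

```python
def resolve_limit(restriction_row, tier_limit_row, feature_default):
    """Priority: user restriction > tier limit > feature default.

    Row presence (not value) decides which level applies, so an explicit
    NULL limit_value at a higher-priority level still means 'unlimited'.
    """
    if restriction_row is not None:
        return restriction_row['limit_value']
    if tier_limit_row is not None:
        return tier_limit_row['limit_value']
    return feature_default


def decide(limit, used):
    """Map a resolved limit and current usage to (allowed, reason)."""
    if limit is None:
        return True, 'unlimited'
    if limit == 0:
        return False, 'feature_disabled'
    allowed = used < limit
    return allowed, 'within_limit' if allowed else 'limit_exceeded'


# A user-specific NULL restriction overrides a finite tier limit:
print(resolve_limit({'limit_value': None}, {'limit_value': 10}, 3))  # None
print(decide(None, 500))  # (True, 'unlimited')
print(decide(5, 5))       # (False, 'limit_exceeded')
```

Separating resolution from the allow/deny decision like this makes the boundary cases (NULL vs 0) easy to unit-test without a database.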
backend/check_features.py (new file, 36 lines)

```python
#!/usr/bin/env python3
"""Quick diagnostic script to check features table."""

from db import get_db, get_cursor

with get_db() as conn:
    cur = get_cursor(conn)

    print("\n=== FEATURES TABLE ===")
    cur.execute("SELECT id, name, active, limit_type, reset_period FROM features ORDER BY id")
    features = cur.fetchall()

    if not features:
        print("❌ NO FEATURES FOUND! Migration failed!")
    else:
        for r in features:
            print(f"  {r['id']:30} {r['name']:40} active={r['active']} type={r['limit_type']:8} reset={r['reset_period']}")

    print(f"\nTotal features: {len(features)}")

    print("\n=== USER_FEATURE_USAGE (recent) ===")
    cur.execute("""
        SELECT profile_id, feature_id, usage_count, reset_at
        FROM user_feature_usage
        ORDER BY updated DESC
        LIMIT 10
    """)
    usages = cur.fetchall()

    if not usages:
        print("  (no usage records yet)")
    else:
        for r in usages:
            print(f"  {r['profile_id'][:8]}... -> {r['feature_id']:30} used={r['usage_count']} reset_at={r['reset_at']}")

    print(f"\nTotal usage records: {len(usages)}")
```
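The aligned columns in this diagnostic come from f-string format specs: `:30` and `:40` left-align a value and pad it with spaces to that minimum width. A tiny standalone illustration with made-up rows:

```python
# Hypothetical feature rows; only the formatting behavior is the point here.
rows = [
    ('data_export', 'Data Export', True),
    ('ai_chat', 'AI Chat', False),
]
for fid, name, active in rows:
    # :30 / :40 left-align and space-pad, so the columns line up
    print(f"  {fid:30} {name:40} active={active}")
```

Values longer than the given width are not truncated; the column simply grows, which is usually acceptable for a throwaway diagnostic.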
backend/check_migration_024.py (new file, 181 lines)

````python
#!/usr/bin/env python3
"""
Quick diagnostic: Check Migration 024 state

Run this inside the backend container:
    docker exec bodytrack-dev-backend-1 python check_migration_024.py
"""

import psycopg2
import os
from psycopg2.extras import RealDictCursor

# Database connection
DB_HOST = os.getenv('DB_HOST', 'db')
DB_PORT = os.getenv('DB_PORT', '5432')
DB_NAME = os.getenv('DB_NAME', 'bodytrack')
DB_USER = os.getenv('DB_USER', 'bodytrack')
DB_PASS = os.getenv('DB_PASSWORD', '')


def main():
    print("=" * 70)
    print("Migration 024 Diagnostic")
    print("=" * 70)

    # Connect to database
    conn = psycopg2.connect(
        host=DB_HOST,
        port=DB_PORT,
        dbname=DB_NAME,
        user=DB_USER,
        password=DB_PASS
    )
    cur = conn.cursor(cursor_factory=RealDictCursor)

    # 1. Check if table exists
    print("\n1. Checking if goal_type_definitions table exists...")
    cur.execute("""
        SELECT EXISTS (
            SELECT FROM information_schema.tables
            WHERE table_name = 'goal_type_definitions'
        )
    """)
    exists = cur.fetchone()['exists']
    print(f"   ✓ Table exists: {exists}")

    if not exists:
        print("\n❌ TABLE DOES NOT EXIST - Migration 024 did not run!")
        print("\nRECOMMENDED ACTION:")
        print("  1. Restart backend container: docker restart bodytrack-dev-backend-1")
        print("  2. Check logs: docker logs bodytrack-dev-backend-1 | grep 'Migration'")
        cur.close()
        conn.close()
        return

    # 2. Check row count
    print("\n2. Checking row count...")
    cur.execute("SELECT COUNT(*) as count FROM goal_type_definitions")
    count = cur.fetchone()['count']
    print(f"   Row count: {count}")

    if count == 0:
        print("\n❌ TABLE IS EMPTY - Seed data was not inserted!")
        print("\nPOSSIBLE CAUSES:")
        print("  - INSERT statements failed (constraint violation?)")
        print("  - Migration ran partially")
        print("\nRECOMMENDED ACTION:")
        print("  Run the seed statements manually (see below)")
    else:
        print(f"   ✓ Table has {count} entries")

    # 3. Show all entries
    print("\n3. Current goal type definitions:")
    cur.execute("""
        SELECT type_key, label_de, unit, is_system, is_active, created_at
        FROM goal_type_definitions
        ORDER BY is_system DESC, type_key
    """)

    entries = cur.fetchall()
    if entries:
        print(f"\n   {'Type Key':<20} {'Label':<20} {'Unit':<10} {'System':<8} {'Active':<8}")
        print("   " + "-" * 70)
        for row in entries:
            status = "SYSTEM" if row['is_system'] else "CUSTOM"
            active = "YES" if row['is_active'] else "NO"
            print(f"   {row['type_key']:<20} {row['label_de']:<20} {row['unit']:<10} {status:<8} {active:<8}")
    else:
        print("   (empty)")

    # 4. Check schema_migrations
    print("\n4. Checking schema_migrations tracking...")
    cur.execute("""
        SELECT EXISTS (
            SELECT FROM information_schema.tables
            WHERE table_name = 'schema_migrations'
        )
    """)
    sm_exists = cur.fetchone()['exists']

    if sm_exists:
        cur.execute("""
            SELECT filename, executed_at
            FROM schema_migrations
            WHERE filename = '024_goal_type_registry.sql'
        """)
        tracked = cur.fetchone()
        if tracked:
            print(f"   ✓ Migration 024 is tracked (executed: {tracked['executed_at']})")
        else:
            print("   ❌ Migration 024 is NOT tracked in schema_migrations")
    else:
        print("   ⚠️ schema_migrations table does not exist")

    # 5. Check for errors
    print("\n5. Potential issues:")
    issues = []

    if count == 0:
        issues.append("No seed data - INSERTs failed")

    if count > 0 and count < 6:
        issues.append(f"Only {count} types (expected 8) - partial seed")

    cur.execute("""
        SELECT COUNT(*) as inactive_count
        FROM goal_type_definitions
        WHERE is_active = false
    """)
    inactive = cur.fetchone()['inactive_count']
    if inactive > 2:
        issues.append(f"{inactive} inactive types (expected 2)")

    if not issues:
        print("   ✓ No issues detected")
    else:
        for issue in issues:
            print(f"   ❌ {issue}")

    # 6. Test query that frontend uses
    print("\n6. Testing frontend query (WHERE is_active = true)...")
    cur.execute("""
        SELECT COUNT(*) as active_count
        FROM goal_type_definitions
        WHERE is_active = true
    """)
    active_count = cur.fetchone()['active_count']
    print(f"   Active types returned: {active_count}")

    if active_count == 0:
        print("   ❌ This is why frontend shows empty list!")

    print("\n" + "=" * 70)
    print("SUMMARY")
    print("=" * 70)

    if count == 0:
        print("\n🔴 PROBLEM: Table exists but has no data")
        print("\nQUICK FIX: Run these SQL commands manually:")
        print("\n```sql")
        print("-- Connect to database:")
        print("docker exec -it bodytrack-dev-db-1 psql -U bodytrack -d bodytrack")
        print("\n-- Then paste migration content:")
        print("-- (copy from backend/migrations/024_goal_type_registry.sql)")
        print("-- Skip CREATE TABLE (already exists), run INSERT statements only")
        print("```")
    elif active_count >= 6:
        print("\n🟢 EVERYTHING LOOKS GOOD")
        print(f"   {active_count} active goal types available")
        print("\nIf frontend still shows error, check:")
        print("  1. Backend logs: docker logs bodytrack-dev-backend-1 -f")
        print("  2. Network tab in browser DevTools")
        print("  3. API endpoint: curl -H 'X-Auth-Token: YOUR_TOKEN' http://localhost:8099/api/goals/goal-types")
    else:
        print(f"\n🟡 PARTIAL DATA: {active_count} active types (expected 6)")
        print("   Some INSERTs might have failed")

    cur.close()
    conn.close()


if __name__ == '__main__':
    main()
````
|
|
@@ -91,9 +91,113 @@ def get_profile_count():
        print(f"Error getting profile count: {e}")
        return -1


def ensure_migration_table():
    """Create migration tracking table if it doesn't exist."""
    try:
        conn = get_connection()
        cur = conn.cursor()
        cur.execute("""
            CREATE TABLE IF NOT EXISTS schema_migrations (
                id SERIAL PRIMARY KEY,
                filename VARCHAR(255) UNIQUE NOT NULL,
                applied_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP
            )
        """)
        conn.commit()
        cur.close()
        conn.close()
        return True
    except Exception as e:
        print(f"Error creating migration table: {e}")
        return False


def get_applied_migrations():
    """Get list of already applied migrations."""
    try:
        conn = get_connection()
        cur = conn.cursor()
        cur.execute("SELECT filename FROM schema_migrations ORDER BY filename")
        migrations = [row[0] for row in cur.fetchall()]
        cur.close()
        conn.close()
        return migrations
    except Exception as e:
        print(f"Error getting applied migrations: {e}")
        return []


def apply_migration(filepath, filename):
    """Apply a single migration file."""
    try:
        with open(filepath, 'r') as f:
            migration_sql = f.read()

        conn = get_connection()
        cur = conn.cursor()

        # Execute migration
        cur.execute(migration_sql)

        # Record migration
        cur.execute(
            "INSERT INTO schema_migrations (filename) VALUES (%s)",
            (filename,)
        )

        conn.commit()
        cur.close()
        conn.close()
        print(f"  ✓ Applied: {filename}")
        return True
    except Exception as e:
        print(f"  ✗ Failed to apply {filename}: {e}")
        return False


def run_migrations(migrations_dir="/app/migrations"):
    """Run all pending migrations."""
    import glob
    import re

    if not os.path.exists(migrations_dir):
        print("✓ No migrations directory found")
        return True

    # Ensure migration tracking table exists
    if not ensure_migration_table():
        return False

    # Get already applied migrations
    applied = get_applied_migrations()

    # Get all migration files (only numbered migrations like 001_*.sql)
    all_files = sorted(glob.glob(os.path.join(migrations_dir, "*.sql")))
    migration_pattern = re.compile(r'^\d{3}_.*\.sql$')
    migration_files = [f for f in all_files if migration_pattern.match(os.path.basename(f))]

    if not migration_files:
        print("✓ No migration files found")
        return True

    # Apply pending migrations
    pending = []
    for filepath in migration_files:
        filename = os.path.basename(filepath)
        if filename not in applied:
            pending.append((filepath, filename))

    if not pending:
        print(f"✓ All {len(applied)} migrations already applied")
        return True

    print(f"  Found {len(pending)} pending migration(s)...")
    for filepath, filename in pending:
        if not apply_migration(filepath, filename):
            return False

    return True
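`run_migrations()` only picks up files whose names match the three-digit prefix pattern, so stray `.sql` files in the directory are ignored. A standalone sketch of that filter, with made-up filenames for illustration:

```python
import re

# Same pattern as in run_migrations(): three digits, underscore, anything, ".sql"
migration_pattern = re.compile(r'^\d{3}_.*\.sql$')

candidates = [
    "001_init.sql",                # matches
    "024_goal_type_registry.sql",  # matches
    "schema.sql",                  # no numeric prefix -> skipped
    "24_short_prefix.sql",         # only two digits -> skipped
    "001_init.sql.bak",            # wrong suffix -> skipped
]

matching = [f for f in candidates if migration_pattern.match(f)]
print(matching)  # ['001_init.sql', '024_goal_type_registry.sql']
```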
if __name__ == "__main__":
    print("═══════════════════════════════════════════════════════════")
-    print("MITAI JINKENDO - Database Initialization (v9b)")
+    print("MITAI JINKENDO - Database Initialization (v9c)")
    print("═══════════════════════════════════════════════════════════")

    # Wait for PostgreSQL
@@ -109,6 +213,12 @@ if __name__ == "__main__":
    else:
        print("✓ Schema already exists")

    # Run migrations
    print("\nRunning database migrations...")
    if not run_migrations():
        print("✗ Migration failed")
        sys.exit(1)

    # Check for migration
    print("\nChecking for SQLite data migration...")
    sqlite_db = "/app/data/bodytrack.db"
backend/evaluation_helper.py (new file, 287 lines)
@@ -0,0 +1,287 @@
"""
Training Type Profiles - Helper Functions
Utilities for loading parameters, profiles, and running evaluations.

Issue: #15
Date: 2026-03-23
"""
from typing import Dict, Optional, List
from decimal import Decimal
import logging

from db import get_cursor
from profile_evaluator import TrainingProfileEvaluator

logger = logging.getLogger(__name__)


def convert_decimals(obj):
    """
    Recursively converts Decimal objects to float for JSON serialization.

    PostgreSQL returns numeric values as Decimal, but psycopg2.Json() can't serialize them.
    """
    if isinstance(obj, Decimal):
        return float(obj)
    elif isinstance(obj, dict):
        return {k: convert_decimals(v) for k, v in obj.items()}
    elif isinstance(obj, list):
        return [convert_decimals(item) for item in obj]
    return obj
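A quick sanity check of `convert_decimals` on a nested row, with illustrative values (the function is re-stated so the snippet runs on its own):

```python
from decimal import Decimal

def convert_decimals(obj):
    # Same logic as the helper above.
    if isinstance(obj, Decimal):
        return float(obj)
    elif isinstance(obj, dict):
        return {k: convert_decimals(v) for k, v in obj.items()}
    elif isinstance(obj, list):
        return [convert_decimals(item) for item in obj]
    return obj

# Illustrative row shape; real rows come from psycopg2 cursors.
row = {
    "weight": Decimal("82.5"),
    "history": [Decimal("83.1"), Decimal("82.8")],
    "meta": {"unit": "kg", "samples": 2},
}

clean = convert_decimals(row)
print(clean)
# {'weight': 82.5, 'history': [83.1, 82.8], 'meta': {'unit': 'kg', 'samples': 2}}
```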
def load_parameters_registry(cur) -> Dict[str, Dict]:
    """
    Loads training parameters registry from database.

    Returns:
        Dict mapping parameter_key -> config
    """
    cur.execute("""
        SELECT key, name_de, name_en, category, data_type, unit,
               description_de, source_field, validation_rules
        FROM training_parameters
        WHERE is_active = true
    """)

    registry = {}
    for row in cur.fetchall():
        registry[row['key']] = dict(row)

    return registry


def load_training_type_profile(cur, training_type_id: int) -> Optional[Dict]:
    """
    Loads training type profile for a given type ID.

    Returns:
        Profile JSONB or None if not configured
    """
    cur.execute(
        "SELECT profile FROM training_types WHERE id = %s",
        (training_type_id,)
    )
    row = cur.fetchone()

    if row and row['profile']:
        return row['profile']

    return None


def load_evaluation_context(
    cur,
    profile_id: str,
    activity_date: str,
    lookback_days: int = 30
) -> Dict:
    """
    Loads context data for evaluation (user profile + recent activities).

    Args:
        cur: Database cursor
        profile_id: User profile ID
        activity_date: Date of activity being evaluated
        lookback_days: How many days of history to load

    Returns:
        {
            "user_profile": {...},
            "recent_activities": [...],
            "historical_activities": [...]
        }
    """
    # Load user profile
    cur.execute(
        "SELECT hf_max, sleep_goal_minutes FROM profiles WHERE id = %s",
        (profile_id,)
    )
    user_row = cur.fetchone()
    user_profile = dict(user_row) if user_row else {}

    # Load recent activities (last N days)
    cur.execute("""
        SELECT id, date, training_type_id, duration_min, hr_avg, hr_max,
               distance_km, kcal_active, rpe
        FROM activity_log
        WHERE profile_id = %s
          AND date >= %s::date - INTERVAL '%s days'
          AND date < %s::date
        ORDER BY date DESC
        LIMIT 50
    """, (profile_id, activity_date, lookback_days, activity_date))

    recent_activities = [dict(r) for r in cur.fetchall()]

    # Historical activities (same for MVP)
    historical_activities = recent_activities

    return {
        "user_profile": user_profile,
        "recent_activities": recent_activities,
        "historical_activities": historical_activities
    }


def evaluate_and_save_activity(
    cur,
    activity_id: str,
    activity_data: Dict,
    training_type_id: int,
    profile_id: str
) -> Optional[Dict]:
    """
    Evaluates an activity and saves the result to the database.

    Args:
        cur: Database cursor
        activity_id: Activity ID
        activity_data: Activity data dict
        training_type_id: Training type ID
        profile_id: User profile ID

    Returns:
        Evaluation result or None if no profile configured
    """
    # Load profile
    profile = load_training_type_profile(cur, training_type_id)
    if not profile:
        logger.info(f"[EVALUATION] No profile for training_type {training_type_id}, skipping")
        return None

    # Load parameters registry
    parameters = load_parameters_registry(cur)

    # Load context
    context = load_evaluation_context(
        cur,
        profile_id,
        activity_data.get("date"),
        lookback_days=30
    )

    # Convert Decimal values in activity_data and context
    activity_data_clean = convert_decimals(activity_data)
    context_clean = convert_decimals(context)

    # Evaluate
    evaluator = TrainingProfileEvaluator(parameters)
    evaluation_result = evaluator.evaluate_activity(
        activity_data_clean,
        profile,
        context_clean
    )

    # Save to database
    from psycopg2.extras import Json

    # Convert Decimal to float for JSON serialization
    evaluation_result_clean = convert_decimals(evaluation_result)

    cur.execute("""
        UPDATE activity_log
        SET evaluation = %s,
            quality_label = %s,
            overall_score = %s
        WHERE id = %s
    """, (
        Json(evaluation_result_clean),
        evaluation_result_clean.get("quality_label"),
        evaluation_result_clean.get("overall_score"),
        activity_id
    ))

    logger.info(
        f"[EVALUATION] Activity {activity_id}: "
        f"{evaluation_result.get('quality_label')} "
        f"(score: {evaluation_result.get('overall_score')})"
    )

    return evaluation_result


def batch_evaluate_activities(
    cur,
    profile_id: str,
    limit: Optional[int] = None
) -> Dict:
    """
    Re-evaluates all activities for a user.

    Useful for:
    - Initial setup after profiles are configured
    - Re-evaluation after profile changes

    Args:
        cur: Database cursor
        profile_id: User profile ID
        limit: Optional limit for testing

    Returns:
        {
            "total": int,
            "evaluated": int,
            "skipped": int,
            "errors": int
        }
    """
    # Load all activities
    query = """
        SELECT id, profile_id, date, training_type_id, duration_min,
               hr_avg, hr_max, distance_km, kcal_active, kcal_resting,
               rpe, pace_min_per_km, cadence, elevation_gain
        FROM activity_log
        WHERE profile_id = %s
        ORDER BY date DESC
    """
    params = [profile_id]

    if limit:
        query += " LIMIT %s"
        params.append(limit)

    cur.execute(query, params)
    activities = cur.fetchall()

    stats = {
        "total": len(activities),
        "evaluated": 0,
        "skipped": 0,
        "errors": 0
    }

    # Track error details
    error_details = []

    for activity in activities:
        activity_dict = dict(activity)
        try:
            result = evaluate_and_save_activity(
                cur,
                activity_dict["id"],
                activity_dict,
                activity_dict["training_type_id"],
                profile_id
            )

            if result:
                stats["evaluated"] += 1
            else:
                stats["skipped"] += 1

        except Exception as e:
            logger.error(f"[BATCH-EVAL] Error evaluating {activity_dict['id']}: {e}")
            error_details.append({
                "activity_id": activity_dict['id'],
                "training_type_id": activity_dict.get('training_type_id'),
                "error": str(e)
            })
            stats["errors"] += 1

    # Add error details to stats (limit to first 10)
    if error_details:
        stats["error_details"] = error_details[:10]

    logger.info(f"[BATCH-EVAL] Completed: {stats}")
    return stats
backend/feature_logger.py (new file, 76 lines)
@@ -0,0 +1,76 @@
"""
Feature Usage Logger for Mitai Jinkendo

Logs all feature access checks to a separate JSON log file for analysis.
Phase 2: Non-blocking monitoring of feature usage.
"""
import logging
import json
from datetime import datetime
from pathlib import Path


# ── Setup Feature Usage Logger ───────────────────────────────────────────────
feature_usage_logger = logging.getLogger('feature_usage')
feature_usage_logger.setLevel(logging.INFO)
feature_usage_logger.propagate = False  # Don't propagate to root logger

# Ensure logs directory exists
LOG_DIR = Path('/app/logs')
LOG_DIR.mkdir(parents=True, exist_ok=True)

# FileHandler for JSON logs
log_file = LOG_DIR / 'feature-usage.log'
file_handler = logging.FileHandler(log_file)
file_handler.setLevel(logging.INFO)
file_handler.setFormatter(logging.Formatter('%(message)s'))  # JSON only
feature_usage_logger.addHandler(file_handler)

# Also log to console in dev (optional)
# console_handler = logging.StreamHandler()
# console_handler.setFormatter(logging.Formatter('[FEATURE-USAGE] %(message)s'))
# feature_usage_logger.addHandler(console_handler)


# ── Logging Function ──────────────────────────────────────────────────────────
def log_feature_usage(user_id: str, feature_id: str, access: dict, action: str):
    """
    Log feature usage in structured JSON format.

    Args:
        user_id: Profile UUID
        feature_id: Feature identifier (e.g., 'weight_entries', 'ai_calls')
        access: Result from check_feature_access() containing:
            - allowed: bool
            - limit: int | None
            - used: int
            - remaining: int | None
            - reason: str
        action: Type of action (e.g., 'create', 'export', 'analyze')

    Example log entry:
        {
            "timestamp": "2026-03-20T15:30:45.123456",
            "user_id": "abc-123",
            "feature": "weight_entries",
            "action": "create",
            "used": 5,
            "limit": 100,
            "remaining": 95,
            "allowed": true,
            "reason": "within_limit"
        }
    """
    entry = {
        "timestamp": datetime.now().isoformat(),
        "user_id": user_id,
        "feature": feature_id,
        "action": action,
        "used": access.get('used', 0),
        "limit": access.get('limit'),  # None for unlimited
        "remaining": access.get('remaining'),  # None for unlimited
        "allowed": access.get('allowed', True),
        "reason": access.get('reason', 'unknown')
    }

    feature_usage_logger.info(json.dumps(entry))
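Because each line of `feature-usage.log` is a single JSON object, the file can be analyzed with a plain line-by-line `json.loads`. A minimal sketch, with sample entries invented for illustration:

```python
import json

# Two illustrative log lines in the format written by log_feature_usage().
sample_lines = [
    '{"timestamp": "2026-03-20T15:30:45", "user_id": "abc-123", '
    '"feature": "weight_entries", "action": "create", "used": 5, '
    '"limit": 100, "remaining": 95, "allowed": true, "reason": "within_limit"}',
    '{"timestamp": "2026-03-20T15:31:02", "user_id": "abc-123", '
    '"feature": "ai_calls", "action": "analyze", "used": 10, '
    '"limit": 10, "remaining": 0, "allowed": false, "reason": "limit_reached"}',
]

# Find features where access was denied.
denied = [entry for entry in map(json.loads, sample_lines) if not entry["allowed"]]
print([entry["feature"] for entry in denied])  # ['ai_calls']
```

In production the lines would come from iterating over the open log file instead of a list.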
backend/fix_seed_goal_types.py (new file, 215 lines)
@@ -0,0 +1,215 @@
#!/usr/bin/env python3
"""
Quick Fix: Insert seed data for goal_type_definitions

This script ONLY inserts the 8 standard goal types.
Safe to run multiple times (uses ON CONFLICT DO NOTHING).

Run inside backend container:
    docker exec bodytrack-dev-backend-1 python fix_seed_goal_types.py
"""

import psycopg2
import os
from psycopg2.extras import RealDictCursor

# Database connection
DB_HOST = os.getenv('DB_HOST', 'db')
DB_PORT = os.getenv('DB_PORT', '5432')
DB_NAME = os.getenv('DB_NAME', 'bodytrack')
DB_USER = os.getenv('DB_USER', 'bodytrack')
DB_PASS = os.getenv('DB_PASSWORD', '')

SEED_DATA = [
    {
        'type_key': 'weight',
        'label_de': 'Gewicht',
        'label_en': 'Weight',
        'unit': 'kg',
        'icon': '⚖️',
        'category': 'body',
        'source_table': 'weight_log',
        'source_column': 'weight',
        'aggregation_method': 'latest',
        'description': 'Aktuelles Körpergewicht',
        'is_system': True
    },
    {
        'type_key': 'body_fat',
        'label_de': 'Körperfett',
        'label_en': 'Body Fat',
        'unit': '%',
        'icon': '📊',
        'category': 'body',
        'source_table': 'caliper_log',
        'source_column': 'body_fat_pct',
        'aggregation_method': 'latest',
        'description': 'Körperfettanteil aus Caliper-Messung',
        'is_system': True
    },
    {
        'type_key': 'lean_mass',
        'label_de': 'Muskelmasse',
        'label_en': 'Lean Mass',
        'unit': 'kg',
        'icon': '💪',
        'category': 'body',
        'calculation_formula': '{"type": "lean_mass", "dependencies": ["weight_log.weight", "caliper_log.body_fat_pct"], "formula": "weight - (weight * body_fat_pct / 100)"}',
        'description': 'Fettfreie Körpermasse (berechnet aus Gewicht und Körperfett)',
        'is_system': True
    },
    {
        'type_key': 'vo2max',
        'label_de': 'VO2Max',
        'label_en': 'VO2Max',
        'unit': 'ml/kg/min',
        'icon': '🫁',
        'category': 'recovery',
        'source_table': 'vitals_baseline',
        'source_column': 'vo2_max',
        'aggregation_method': 'latest',
        'description': 'Maximale Sauerstoffaufnahme (geschätzt oder gemessen)',
        'is_system': True
    },
    {
        'type_key': 'rhr',
        'label_de': 'Ruhepuls',
        'label_en': 'Resting Heart Rate',
        'unit': 'bpm',
        'icon': '💓',
        'category': 'recovery',
        'source_table': 'vitals_baseline',
        'source_column': 'resting_hr',
        'aggregation_method': 'latest',
        'description': 'Ruhepuls morgens vor dem Aufstehen',
        'is_system': True
    },
    {
        'type_key': 'bp',
        'label_de': 'Blutdruck',
        'label_en': 'Blood Pressure',
        'unit': 'mmHg',
        'icon': '❤️',
        'category': 'recovery',
        'source_table': 'blood_pressure_log',
        'source_column': 'systolic',
        'aggregation_method': 'latest',
        'description': 'Blutdruck (aktuell nur systolisch, v2.0: beide Werte)',
        'is_system': True
    },
    {
        'type_key': 'strength',
        'label_de': 'Kraft',
        'label_en': 'Strength',
        'unit': 'kg',
        'icon': '🏋️',
        'category': 'activity',
        'description': 'Maximalkraft (Platzhalter, Datenquelle in v2.0)',
        'is_system': True,
        'is_active': False
    },
    {
        'type_key': 'flexibility',
        'label_de': 'Beweglichkeit',
        'label_en': 'Flexibility',
        'unit': 'cm',
        'icon': '🤸',
        'category': 'activity',
        'description': 'Beweglichkeit (Platzhalter, Datenquelle in v2.0)',
        'is_system': True,
        'is_active': False
    }
]


def main():
    print("=" * 70)
    print("Goal Type Definitions - Seed Data Fix")
    print("=" * 70)

    # Connect to database
    conn = psycopg2.connect(
        host=DB_HOST,
        port=DB_PORT,
        dbname=DB_NAME,
        user=DB_USER,
        password=DB_PASS
    )
    conn.autocommit = False
    cur = conn.cursor(cursor_factory=RealDictCursor)

    try:
        # Check current state
        cur.execute("SELECT COUNT(*) as count FROM goal_type_definitions")
        before_count = cur.fetchone()['count']
        print(f"\nBefore: {before_count} goal types in database")

        # Insert seed data
        print(f"\nInserting {len(SEED_DATA)} standard goal types...")
        inserted = 0
        skipped = 0

        for data in SEED_DATA:
            columns = list(data.keys())
            values = [data[col] for col in columns]
            placeholders = ', '.join(['%s'] * len(values))
            cols_str = ', '.join(columns)

            sql = f"""
                INSERT INTO goal_type_definitions ({cols_str})
                VALUES ({placeholders})
                ON CONFLICT (type_key) DO NOTHING
                RETURNING id
            """

            cur.execute(sql, values)
            result = cur.fetchone()

            if result:
                inserted += 1
                print(f"  ✓ {data['type_key']}: {data['label_de']}")
            else:
                skipped += 1
                print(f"  - {data['type_key']}: already exists (skipped)")

        conn.commit()

        # Check final state
        cur.execute("SELECT COUNT(*) as count FROM goal_type_definitions")
        after_count = cur.fetchone()['count']

        print(f"\nAfter: {after_count} goal types in database")
        print(f"  Inserted: {inserted}")
        print(f"  Skipped: {skipped}")

        # Show summary
        cur.execute("""
            SELECT type_key, label_de, is_active, is_system
            FROM goal_type_definitions
            ORDER BY is_system DESC, type_key
        """)

        print("\n" + "=" * 70)
        print("Current Goal Types:")
        print("=" * 70)
        print(f"\n{'Type Key':<20} {'Label':<20} {'System':<8} {'Active':<8}")
        print("-" * 70)

        for row in cur.fetchall():
            status = "YES" if row['is_system'] else "NO"
            active = "YES" if row['is_active'] else "NO"
            print(f"{row['type_key']:<20} {row['label_de']:<20} {status:<8} {active:<8}")

        print("\n✅ DONE! Goal types seeded successfully.")
        print("\nNext step: Reload frontend to see the changes.")

    except Exception as e:
        conn.rollback()
        print(f"\n❌ Error: {e}")
        import traceback
        traceback.print_exc()
    finally:
        cur.close()
        conn.close()


if __name__ == '__main__':
    main()
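The seed loop builds each INSERT dynamically from the dict keys, so optional columns such as `is_active` only appear for entries that define them. The generated SQL can be sketched standalone (single-line formatting here for readability; the script itself uses a multi-line f-string):

```python
# One illustrative seed entry with an optional column present.
data = {
    'type_key': 'strength',
    'label_de': 'Kraft',
    'is_system': True,
    'is_active': False,
}

# Same construction as in fix_seed_goal_types.py: columns and placeholders
# are derived from the dict, preserving insertion order.
columns = list(data.keys())
values = [data[col] for col in columns]
placeholders = ', '.join(['%s'] * len(values))
cols_str = ', '.join(columns)

sql = (
    f"INSERT INTO goal_type_definitions ({cols_str}) "
    f"VALUES ({placeholders}) "
    f"ON CONFLICT (type_key) DO NOTHING RETURNING id"
)
print(sql)
# INSERT INTO goal_type_definitions (type_key, label_de, is_system, is_active) VALUES (%s, %s, %s, %s) ON CONFLICT (type_key) DO NOTHING RETURNING id
```

The `%s` placeholders are filled by psycopg2 at execute time, so the values themselves never enter the SQL string.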
backend/goal_utils.py (new file, 518 lines)
@@ -0,0 +1,518 @@
"""
Goal Utilities - Abstraction Layer for Focus Weights & Universal Value Fetcher

This module provides:
1. Abstraction layer between goal modes and focus weights (Phase 1)
2. Universal value fetcher for dynamic goal types (Phase 1.5)

Version History:
- V1 (Phase 1): Maps goal_mode to predefined weights
- V1.5 (Phase 1.5): Universal value fetcher for DB-registry goal types
- V2 (future): Reads from focus_areas table with custom user weights

Part of Phase 1 + Phase 1.5: Flexible Goal System
"""

from typing import Dict, Optional, Any
from datetime import date, timedelta
from decimal import Decimal
import json
from db import get_cursor


def get_focus_weights(conn, profile_id: str) -> Dict[str, float]:
    """
    Get focus area weights for a profile.

    V2 (Goal System v2.0): Reads from focus_areas table with custom user weights.
    Falls back to goal_mode mapping if focus_areas not set.

    Args:
        conn: Database connection
        profile_id: User's profile ID

    Returns:
        Dict with focus weights (sum = 1.0):
        {
            'weight_loss': 0.3,   # Fat loss priority
            'muscle_gain': 0.2,   # Muscle gain priority
            'strength': 0.25,     # Strength training priority
            'endurance': 0.25,    # Cardio/endurance priority
            'flexibility': 0.0,   # Mobility priority
            'health': 0.0         # General health maintenance
        }

    Example Usage in Phase 0b:
        weights = get_focus_weights(conn, profile_id)

        # Score calculation considers user's focus
        overall_score = (
            body_score * weights['weight_loss'] +
            strength_score * weights['strength'] +
            cardio_score * weights['endurance']
        )
    """
    cur = get_cursor(conn)

    # V2: Try to fetch from focus_areas table
    cur.execute("""
        SELECT weight_loss_pct, muscle_gain_pct, strength_pct,
               endurance_pct, flexibility_pct, health_pct
        FROM focus_areas
        WHERE profile_id = %s AND active = true
        LIMIT 1
    """, (profile_id,))

    row = cur.fetchone()

    if row:
        # Convert percentages to weights (0-1 range)
        return {
            'weight_loss': row['weight_loss_pct'] / 100.0,
            'muscle_gain': row['muscle_gain_pct'] / 100.0,
            'strength': row['strength_pct'] / 100.0,
            'endurance': row['endurance_pct'] / 100.0,
            'flexibility': row['flexibility_pct'] / 100.0,
            'health': row['health_pct'] / 100.0
        }

    # V1 Fallback: Use goal_mode if focus_areas not set
    cur.execute(
        "SELECT goal_mode FROM profiles WHERE id = %s",
        (profile_id,)
    )
    row = cur.fetchone()

    if not row:
        # Ultimate fallback: balanced health focus
        return {
            'weight_loss': 0.0,
            'muscle_gain': 0.0,
            'strength': 0.10,
            'endurance': 0.20,
            'flexibility': 0.15,
            'health': 0.55
        }

    goal_mode = row['goal_mode']

    if not goal_mode:
        return {
            'weight_loss': 0.0,
            'muscle_gain': 0.0,
            'strength': 0.10,
            'endurance': 0.20,
            'flexibility': 0.15,
            'health': 0.55
        }

    # V1: Predefined weight mappings per goal_mode (fallback)
    WEIGHT_MAPPINGS = {
        'weight_loss': {
            'weight_loss': 0.60,
            'endurance': 0.20,
            'muscle_gain': 0.0,
            'strength': 0.10,
            'flexibility': 0.05,
            'health': 0.05
        },
        'strength': {
            'strength': 0.50,
            'muscle_gain': 0.40,
            'endurance': 0.10,
            'weight_loss': 0.0,
            'flexibility': 0.0,
            'health': 0.0
        },
        'endurance': {
            'endurance': 0.70,
            'health': 0.20,
            'flexibility': 0.10,
            'weight_loss': 0.0,
            'muscle_gain': 0.0,
            'strength': 0.0
        },
        'recomposition': {
            'weight_loss': 0.30,
            'muscle_gain': 0.30,
            'strength': 0.25,
            'endurance': 0.10,
            'flexibility': 0.05,
            'health': 0.0
        },
        'health': {
            'health': 0.50,
            'endurance': 0.20,
            'flexibility': 0.15,
            'strength': 0.10,
            'weight_loss': 0.05,
            'muscle_gain': 0.0
        }
    }

    return WEIGHT_MAPPINGS.get(goal_mode, WEIGHT_MAPPINGS['health'])


def get_primary_focus(conn, profile_id: str) -> str:
    """
    Get the primary focus area for a profile.

    Returns the focus area with the highest weight.
    Useful for UI labels and simple decision logic.

    Args:
        conn: Database connection
        profile_id: User's profile ID

    Returns:
        Primary focus area name (e.g., 'weight_loss', 'strength')
    """
    weights = get_focus_weights(conn, profile_id)
    return max(weights.items(), key=lambda x: x[1])[0]
def get_focus_description(focus_area: str) -> str:
|
||||||
|
"""
|
||||||
|
Get human-readable description for a focus area.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
focus_area: Focus area key (e.g., 'weight_loss')
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
German description for UI display
|
||||||
|
"""
|
||||||
|
descriptions = {
|
||||||
|
'weight_loss': 'Gewichtsreduktion & Fettabbau',
|
||||||
|
'muscle_gain': 'Muskelaufbau & Hypertrophie',
|
||||||
|
'strength': 'Kraftsteigerung & Performance',
|
||||||
|
'endurance': 'Ausdauer & aerobe Kapazität',
|
||||||
|
'flexibility': 'Beweglichkeit & Mobilität',
|
||||||
|
'health': 'Allgemeine Gesundheit & Erhaltung'
|
||||||
|
}
|
||||||
|
return descriptions.get(focus_area, focus_area)
|
||||||
|
|
||||||
|
|
||||||
|
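As a standalone illustration (not part of this diff), the selection rule in `get_primary_focus` is a plain `max` over the weight dict; with the `recomposition` weights above, `weight_loss` and `muscle_gain` tie at 0.30, and `max()` keeps the first maximal key in insertion order:

```python
# Sketch of get_primary_focus's selection rule, using the 'recomposition'
# weights from the mapping above (no DB needed).
weights = {
    'weight_loss': 0.30,
    'muscle_gain': 0.30,
    'strength': 0.25,
    'endurance': 0.10,
    'flexibility': 0.05,
    'health': 0.0,
}
primary = max(weights.items(), key=lambda x: x[1])[0]
# Ties resolve to the first key in insertion order: 'weight_loss'
```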
# ============================================================================
# Phase 1.5: Universal Value Fetcher for Dynamic Goal Types
# ============================================================================

def get_goal_type_config(conn, type_key: str) -> Optional[Dict[str, Any]]:
    """
    Get goal type configuration from the database registry.

    Args:
        conn: Database connection
        type_key: Goal type key (e.g., 'weight', 'meditation_minutes')

    Returns:
        Dict with config, or None if not found/inactive
    """
    cur = get_cursor(conn)

    cur.execute("""
        SELECT type_key, source_table, source_column, aggregation_method,
               calculation_formula, filter_conditions, label_de, unit, icon, category
        FROM goal_type_definitions
        WHERE type_key = %s AND is_active = true
        LIMIT 1
    """, (type_key,))

    return cur.fetchone()

def get_current_value_for_goal(conn, profile_id: str, goal_type: str) -> Optional[float]:
    """
    Universal value fetcher for any goal type.

    Reads configuration from the goal_type_definitions table and executes the
    appropriate query based on aggregation_method or calculation_formula.

    Args:
        conn: Database connection
        profile_id: User's profile ID
        goal_type: Goal type key (e.g., 'weight', 'meditation_minutes')

    Returns:
        Current value as float, or None if not available
    """
    config = get_goal_type_config(conn, goal_type)

    if not config:
        print(f"[WARNING] Goal type '{goal_type}' not found or inactive")
        return None

    # Complex calculation (e.g., lean_mass)
    if config['calculation_formula']:
        return _execute_calculation_formula(conn, profile_id, config['calculation_formula'])

    # Simple aggregation
    return _fetch_by_aggregation_method(
        conn,
        profile_id,
        config['source_table'],
        config['source_column'],
        config['aggregation_method'],
        config.get('filter_conditions')
    )

def _fetch_by_aggregation_method(
    conn,
    profile_id: str,
    table: str,
    column: str,
    method: str,
    filter_conditions: Optional[Any] = None
) -> Optional[float]:
    """
    Fetch a value using the specified aggregation method.

    Supported methods:
    - latest: Most recent value
    - avg_7d: 7-day average
    - avg_30d: 30-day average
    - sum_30d: 30-day sum
    - count_7d: Count of entries in the last 7 days
    - count_30d: Count of entries in the last 30 days
    - min_30d: Minimum value in the last 30 days
    - max_30d: Maximum value in the last 30 days

    Args:
        filter_conditions: Optional JSON filters (e.g., {"training_category": "strength"})
    """
    # Guard: source_table/source_column are required for simple aggregation
    if not table or not column:
        print("[WARNING] Missing source_table or source_column for aggregation")
        return None

    # Table-specific date column mapping (some tables use different column names)
    DATE_COLUMN_MAP = {
        'blood_pressure_log': 'measured_at',
        'activity_log': 'date',
        'weight_log': 'date',
        'circumference_log': 'date',
        'caliper_log': 'date',
        'nutrition_log': 'date',
        'sleep_log': 'date',
        'vitals_baseline': 'date',
        'rest_days': 'date',
        'fitness_tests': 'test_date'
    }
    date_col = DATE_COLUMN_MAP.get(table, 'date')

    # Build filter SQL from JSON conditions.
    # NOTE: table, column, and the filter column names are interpolated into the
    # SQL string, so they must only ever come from the trusted
    # goal_type_definitions registry, never from user input.
    filter_sql = ""
    filter_params = []

    if filter_conditions:
        try:
            if isinstance(filter_conditions, str):
                filters = json.loads(filter_conditions)
            else:
                filters = filter_conditions

            for filter_col, filter_val in filters.items():
                if isinstance(filter_val, list):
                    # IN clause for multiple values
                    placeholders = ', '.join(['%s'] * len(filter_val))
                    filter_sql += f" AND {filter_col} IN ({placeholders})"
                    filter_params.extend(filter_val)
                else:
                    # Single-value equality
                    filter_sql += f" AND {filter_col} = %s"
                    filter_params.append(filter_val)
        except (json.JSONDecodeError, TypeError, AttributeError) as e:
            print(f"[WARNING] Invalid filter_conditions: {e}, ignoring filters")

    cur = get_cursor(conn)

    try:
        if method == 'latest':
            params = [profile_id] + filter_params
            cur.execute(f"""
                SELECT {column} FROM {table}
                WHERE profile_id = %s AND {column} IS NOT NULL{filter_sql}
                ORDER BY {date_col} DESC LIMIT 1
            """, params)
            row = cur.fetchone()
            return float(row[column]) if row else None

        elif method == 'avg_7d':
            days_ago = date.today() - timedelta(days=7)
            params = [profile_id, days_ago] + filter_params
            cur.execute(f"""
                SELECT AVG({column}) as avg_value FROM {table}
                WHERE profile_id = %s AND {date_col} >= %s AND {column} IS NOT NULL{filter_sql}
            """, params)
            row = cur.fetchone()
            return float(row['avg_value']) if row and row['avg_value'] is not None else None

        elif method == 'avg_30d':
            days_ago = date.today() - timedelta(days=30)
            params = [profile_id, days_ago] + filter_params
            cur.execute(f"""
                SELECT AVG({column}) as avg_value FROM {table}
                WHERE profile_id = %s AND {date_col} >= %s AND {column} IS NOT NULL{filter_sql}
            """, params)
            row = cur.fetchone()
            return float(row['avg_value']) if row and row['avg_value'] is not None else None

        elif method == 'sum_30d':
            days_ago = date.today() - timedelta(days=30)
            params = [profile_id, days_ago] + filter_params
            cur.execute(f"""
                SELECT SUM({column}) as sum_value FROM {table}
                WHERE profile_id = %s AND {date_col} >= %s AND {column} IS NOT NULL{filter_sql}
            """, params)
            row = cur.fetchone()
            return float(row['sum_value']) if row and row['sum_value'] is not None else None

        elif method == 'count_7d':
            days_ago = date.today() - timedelta(days=7)
            params = [profile_id, days_ago] + filter_params
            cur.execute(f"""
                SELECT COUNT(*) as count_value FROM {table}
                WHERE profile_id = %s AND {date_col} >= %s{filter_sql}
            """, params)
            row = cur.fetchone()
            return float(row['count_value']) if row else 0.0

        elif method == 'count_30d':
            days_ago = date.today() - timedelta(days=30)
            params = [profile_id, days_ago] + filter_params
            cur.execute(f"""
                SELECT COUNT(*) as count_value FROM {table}
                WHERE profile_id = %s AND {date_col} >= %s{filter_sql}
            """, params)
            row = cur.fetchone()
            return float(row['count_value']) if row else 0.0

        elif method == 'min_30d':
            days_ago = date.today() - timedelta(days=30)
            params = [profile_id, days_ago] + filter_params
            cur.execute(f"""
                SELECT MIN({column}) as min_value FROM {table}
                WHERE profile_id = %s AND {date_col} >= %s AND {column} IS NOT NULL{filter_sql}
            """, params)
            row = cur.fetchone()
            return float(row['min_value']) if row and row['min_value'] is not None else None

        elif method == 'max_30d':
            days_ago = date.today() - timedelta(days=30)
            params = [profile_id, days_ago] + filter_params
            cur.execute(f"""
                SELECT MAX({column}) as max_value FROM {table}
                WHERE profile_id = %s AND {date_col} >= %s AND {column} IS NOT NULL{filter_sql}
            """, params)
            row = cur.fetchone()
            return float(row['max_value']) if row and row['max_value'] is not None else None

        else:
            print(f"[WARNING] Unknown aggregation method: {method}")
            return None

    except Exception as e:
        # Log detailed error for debugging
        print(f"[ERROR] Failed to fetch value from {table}.{column} using {method}: {e}")
        print(f"[ERROR] Filter conditions: {filter_conditions}")
        print(f"[ERROR] Filter SQL: {filter_sql}")
        print(f"[ERROR] Filter params: {filter_params}")

        # CRITICAL: Roll back the transaction to avoid InFailedSqlTransaction errors
        try:
            conn.rollback()
            print("[INFO] Transaction rolled back after query error")
        except Exception as rollback_err:
            print(f"[WARNING] Rollback failed: {rollback_err}")

        # Return None so goal creation can continue without current_value
        # (current_value will be NULL in the goal record)
        return None

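The filter-building loop in `_fetch_by_aggregation_method` can be exercised on its own; a minimal standalone mirror of it (no DB needed, identifiers illustrative):

```python
import json

def build_filter_sql(filter_conditions):
    """Mirror of the loop above: JSON filters -> (SQL fragment, params)."""
    filter_sql, params = "", []
    if isinstance(filter_conditions, str):
        filters = json.loads(filter_conditions)
    else:
        filters = filter_conditions
    for col, val in filters.items():
        if isinstance(val, list):
            # IN clause for multiple values
            placeholders = ', '.join(['%s'] * len(val))
            filter_sql += f" AND {col} IN ({placeholders})"
            params.extend(val)
        else:
            # Single-value equality
            filter_sql += f" AND {col} = %s"
            params.append(val)
    return filter_sql, params

sql, params = build_filter_sql({"training_category": ["strength", "hiit"], "source": "manual"})
# sql    == " AND training_category IN (%s, %s) AND source = %s"
# params == ["strength", "hiit", "manual"]
```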
def _execute_calculation_formula(conn, profile_id: str, formula_json: str) -> Optional[float]:
    """
    Execute a complex calculation formula.

    Currently supports:
    - lean_mass: weight - (weight * body_fat_pct / 100)

    Future: Parse the JSON formula and execute it dynamically.

    Args:
        conn: Database connection
        profile_id: User's profile ID
        formula_json: JSON string with calculation config

    Returns:
        Calculated value or None
    """
    try:
        formula = json.loads(formula_json)
        calc_type = formula.get('type')

        if calc_type == 'lean_mass':
            # Get dependencies
            cur = get_cursor(conn)

            cur.execute("""
                SELECT weight FROM weight_log
                WHERE profile_id = %s
                ORDER BY date DESC LIMIT 1
            """, (profile_id,))
            weight_row = cur.fetchone()

            cur.execute("""
                SELECT body_fat_pct FROM caliper_log
                WHERE profile_id = %s
                ORDER BY date DESC LIMIT 1
            """, (profile_id,))
            bf_row = cur.fetchone()

            if weight_row and bf_row:
                weight = float(weight_row['weight'])
                bf_pct = float(bf_row['body_fat_pct'])
                lean_mass = weight - (weight * bf_pct / 100.0)
                return round(lean_mass, 2)

            return None

        else:
            print(f"[WARNING] Unknown calculation type: {calc_type}")
            return None

    except (json.JSONDecodeError, KeyError, ValueError, TypeError) as e:
        print(f"[ERROR] Formula execution failed: {e}, formula={formula_json}")
        return None

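The lean_mass arithmetic above, worked through with hypothetical inputs (the readings are illustrative, not data from this repository):

```python
# Hypothetical latest readings:
weight = 82.4   # kg, most recent weight_log entry
bf_pct = 18.5   # %, most recent caliper_log entry

# Same formula as the lean_mass branch above:
lean_mass = round(weight - (weight * bf_pct / 100.0), 2)
# 82.4 - 82.4 * 0.185 = 82.4 - 15.244 = 67.156, rounded to 67.16
```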
# Future V2 implementation (commented out for reference):
"""
def get_focus_weights_v2(conn, profile_id: str) -> Dict[str, float]:
    '''V2: Read from the focus_areas table with custom user weights'''
    cur = get_cursor(conn)

    cur.execute('''
        SELECT weight_loss_pct, muscle_gain_pct, endurance_pct,
               strength_pct, flexibility_pct, health_pct
        FROM focus_areas
        WHERE profile_id = %s AND active = true
        LIMIT 1
    ''', (profile_id,))

    row = cur.fetchone()

    if not row:
        # Fall back to V1 behavior
        return get_focus_weights(conn, profile_id)

    # Convert percentages to weights (0-1 range)
    return {
        'weight_loss': row['weight_loss_pct'] / 100.0,
        'muscle_gain': row['muscle_gain_pct'] / 100.0,
        'endurance': row['endurance_pct'] / 100.0,
        'strength': row['strength_pct'] / 100.0,
        'flexibility': row['flexibility_pct'] / 100.0,
        'health': row['health_pct'] / 100.0
    }
"""
backend/main.py | 1920
  File diff suppressed because it is too large.

backend/main_old.py (new file) | 1878
  File diff suppressed because it is too large.

backend/migrations/003_add_email_verification.sql (new file) | 25
@@ -0,0 +1,25 @@
-- ================================================================
-- Migration 003: Add Email Verification Fields
-- Version: v9c
-- Date: 2026-03-21
-- ================================================================

-- Add email verification columns to profiles table
ALTER TABLE profiles
ADD COLUMN IF NOT EXISTS email_verified BOOLEAN DEFAULT FALSE,
ADD COLUMN IF NOT EXISTS verification_token TEXT,
ADD COLUMN IF NOT EXISTS verification_expires TIMESTAMP WITH TIME ZONE;

-- Create index for verification token lookups
CREATE INDEX IF NOT EXISTS idx_profiles_verification_token
ON profiles(verification_token)
WHERE verification_token IS NOT NULL;

-- Mark existing users with email as verified (grandfather clause).
-- Note: the column is added with DEFAULT FALSE, so existing rows are
-- backfilled with FALSE, never NULL; matching on IS NULL would make this
-- update a no-op.
UPDATE profiles
SET email_verified = TRUE
WHERE email IS NOT NULL AND email_verified = FALSE;

COMMENT ON COLUMN profiles.email_verified IS 'Whether email address has been verified';
COMMENT ON COLUMN profiles.verification_token IS 'One-time token for email verification';
COMMENT ON COLUMN profiles.verification_expires IS 'Verification token expiry (24h from creation)';
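A minimal backend-side sketch of issuing a token for this schema (the helper name is hypothetical, not code from this diff): generate a one-time `verification_token` plus a 24-hour `verification_expires`, matching the column comments above.

```python
import secrets
from datetime import datetime, timedelta, timezone

def issue_verification_token() -> tuple:
    """Return a one-time URL-safe token and its 24h expiry timestamp."""
    token = secrets.token_urlsafe(32)  # 32 random bytes, URL-safe base64
    expires = datetime.now(timezone.utc) + timedelta(hours=24)
    return token, expires

token, expires = issue_verification_token()
```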
backend/migrations/004_training_types.sql (new file) | 86
@@ -0,0 +1,86 @@
-- Migration 004: Training Types & Categories
-- Part of v9d: sleep + sport deep-dive (Schlaf + Sport-Vertiefung)
-- Created: 2026-03-21

-- ========================================
-- 1. Create training_types table
-- ========================================
CREATE TABLE IF NOT EXISTS training_types (
    id SERIAL PRIMARY KEY,
    category VARCHAR(50) NOT NULL,   -- Main category: 'cardio', 'strength', 'hiit', etc.
    subcategory VARCHAR(50),         -- Optional: 'running', 'hypertrophy', etc.
    name_de VARCHAR(100) NOT NULL,   -- German display name
    name_en VARCHAR(100) NOT NULL,   -- English display name
    icon VARCHAR(10),                -- Emoji icon
    sort_order INTEGER DEFAULT 0,    -- For UI ordering
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- ========================================
-- 2. Add training type columns to activity_log
-- ========================================
ALTER TABLE activity_log
ADD COLUMN IF NOT EXISTS training_type_id INTEGER REFERENCES training_types(id),
ADD COLUMN IF NOT EXISTS training_category VARCHAR(50),    -- Denormalized for fast queries
ADD COLUMN IF NOT EXISTS training_subcategory VARCHAR(50); -- Denormalized

-- ========================================
-- 3. Create indexes
-- ========================================
CREATE INDEX IF NOT EXISTS idx_activity_training_type ON activity_log(training_type_id);
CREATE INDEX IF NOT EXISTS idx_activity_training_category ON activity_log(training_category);
CREATE INDEX IF NOT EXISTS idx_training_types_category ON training_types(category);

-- ========================================
-- 4. Seed training types data
-- ========================================

-- Cardio (endurance)
INSERT INTO training_types (category, subcategory, name_de, name_en, icon, sort_order) VALUES
('cardio', 'running', 'Laufen', 'Running', '🏃', 100),
('cardio', 'cycling', 'Radfahren', 'Cycling', '🚴', 101),
('cardio', 'swimming', 'Schwimmen', 'Swimming', '🏊', 102),
('cardio', 'rowing', 'Rudern', 'Rowing', '🚣', 103),
('cardio', 'other', 'Sonstiges Cardio', 'Other Cardio', '❤️', 104);

-- Strength
INSERT INTO training_types (category, subcategory, name_de, name_en, icon, sort_order) VALUES
('strength', 'hypertrophy', 'Hypertrophie', 'Hypertrophy', '💪', 200),
('strength', 'maxstrength', 'Maximalkraft', 'Max Strength', '🏋️', 201),
('strength', 'endurance', 'Kraftausdauer', 'Strength Endurance', '🔁', 202),
('strength', 'functional', 'Funktionell', 'Functional', '⚡', 203);

-- Explosive strength / HIIT
INSERT INTO training_types (category, subcategory, name_de, name_en, icon, sort_order) VALUES
('hiit', 'hiit', 'HIIT', 'HIIT', '🔥', 300),
('hiit', 'explosive', 'Explosiv', 'Explosive', '💥', 301),
('hiit', 'circuit', 'Circuit Training', 'Circuit Training', '🔄', 302);

-- Martial arts / technique strength
INSERT INTO training_types (category, subcategory, name_de, name_en, icon, sort_order) VALUES
('martial_arts', 'technique', 'Techniktraining', 'Technique Training', '🥋', 400),
('martial_arts', 'sparring', 'Sparring / Wettkampf', 'Sparring / Competition', '🥊', 401),
('martial_arts', 'strength', 'Kraft für Kampfsport', 'Martial Arts Strength', '⚔️', 402);

-- Mobility & stretching
INSERT INTO training_types (category, subcategory, name_de, name_en, icon, sort_order) VALUES
('mobility', 'static', 'Statisches Dehnen', 'Static Stretching', '🧘', 500),
('mobility', 'dynamic', 'Dynamisches Dehnen', 'Dynamic Stretching', '🤸', 501),
('mobility', 'yoga', 'Yoga', 'Yoga', '🕉️', 502),
('mobility', 'fascia', 'Faszienarbeit', 'Fascia Work', '🎯', 503);

-- Recovery (active)
INSERT INTO training_types (category, subcategory, name_de, name_en, icon, sort_order) VALUES
('recovery', 'walk', 'Spaziergang', 'Walk', '🚶', 600),
('recovery', 'swim_light', 'Leichtes Schwimmen', 'Light Swimming', '🏊', 601),
('recovery', 'regeneration', 'Regenerationseinheit', 'Regeneration', '💆', 602);

-- General / Uncategorized
INSERT INTO training_types (category, subcategory, name_de, name_en, icon, sort_order) VALUES
('other', NULL, 'Sonstiges', 'Other', '📝', 900);

-- ========================================
-- 5. Add comments
-- ========================================
COMMENT ON TABLE training_types IS 'v9d: Training type categories and subcategories';
COMMENT ON TABLE activity_log IS 'Extended in v9d with training_type_id for categorization';
backend/migrations/005_training_types_extended.sql (new file) | 24
@@ -0,0 +1,24 @@
-- Migration 005: Extended Training Types
-- Add: cardio (walking, dance), Mind & Meditation category
-- Created: 2026-03-21

-- ========================================
-- Add new cardio subcategories
-- ========================================
INSERT INTO training_types (category, subcategory, name_de, name_en, icon, sort_order) VALUES
('cardio', 'walk', 'Gehen', 'Walking', '🚶', 105),
('cardio', 'dance', 'Tanzen', 'Dance', '💃', 106);

-- ========================================
-- Add new category: mind & meditation (Geist & Meditation)
-- ========================================
INSERT INTO training_types (category, subcategory, name_de, name_en, icon, sort_order) VALUES
('mind', 'meditation', 'Meditation', 'Meditation', '🧘‍♂️', 700),
('mind', 'breathwork', 'Atemarbeit', 'Breathwork', '🫁', 701),
('mind', 'mindfulness', 'Achtsamkeit', 'Mindfulness', '☮️', 702),
('mind', 'visualization', 'Visualisierung', 'Visualization', '🎨', 703);

-- ========================================
-- Update comment
-- ========================================
COMMENT ON TABLE training_types IS 'v9d Phase 1b: Extended with cardio walk/dance and mind category';
backend/migrations/006_training_types_abilities.sql (new file) | 29
@@ -0,0 +1,29 @@
-- Migration 006: Training Types - Abilities Mapping
-- Add abilities JSONB column for future AI analysis
-- Maps to: koordinativ, konditionell, kognitiv, psychisch, taktisch
-- Created: 2026-03-21

-- ========================================
-- Add abilities column
-- ========================================
ALTER TABLE training_types
ADD COLUMN IF NOT EXISTS abilities JSONB DEFAULT '{}';

-- ========================================
-- Add description columns for better documentation
-- ========================================
ALTER TABLE training_types
ADD COLUMN IF NOT EXISTS description_de TEXT,
ADD COLUMN IF NOT EXISTS description_en TEXT;

-- ========================================
-- Add index for abilities queries
-- ========================================
CREATE INDEX IF NOT EXISTS idx_training_types_abilities ON training_types USING GIN (abilities);

-- ========================================
-- Comments
-- ========================================
COMMENT ON COLUMN training_types.abilities IS 'JSONB: Maps to athletic abilities for AI analysis (koordinativ, konditionell, kognitiv, psychisch, taktisch)';
COMMENT ON COLUMN training_types.description_de IS 'German description for admin UI and AI context';
COMMENT ON COLUMN training_types.description_en IS 'English description for admin UI and AI context';
backend/migrations/007_activity_type_mappings.sql (new file) | 121
@@ -0,0 +1,121 @@
-- Migration 007: Activity Type Mappings (Learnable System)
-- Replaces hardcoded mappings with a DB-based configurable system
-- Created: 2026-03-21

-- ========================================
-- 1. Create activity_type_mappings table
-- ========================================
CREATE TABLE IF NOT EXISTS activity_type_mappings (
    id SERIAL PRIMARY KEY,
    activity_type VARCHAR(100) NOT NULL,
    training_type_id INTEGER NOT NULL REFERENCES training_types(id) ON DELETE CASCADE,
    profile_id VARCHAR(36),               -- NULL = global mapping, otherwise user-specific
    source VARCHAR(20) DEFAULT 'manual',  -- 'manual', 'bulk', 'admin', 'default'
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    CONSTRAINT unique_activity_type_per_profile UNIQUE(activity_type, profile_id)
);

-- ========================================
-- 2. Create indexes
-- ========================================
CREATE INDEX IF NOT EXISTS idx_activity_type_mappings_type ON activity_type_mappings(activity_type);
CREATE INDEX IF NOT EXISTS idx_activity_type_mappings_profile ON activity_type_mappings(profile_id);

-- ========================================
-- 3. Seed default mappings (global)
-- ========================================
-- Note: These are the German Apple Health workout types.
-- training_type_id references are based on existing training_types data.

-- Look up training_type_id values by subcategory, then seed the mappings
DO $$
DECLARE
    v_running_id INTEGER;
    v_walk_id INTEGER;
    v_cycling_id INTEGER;
    v_swimming_id INTEGER;
    v_hypertrophy_id INTEGER;
    v_functional_id INTEGER;
    v_hiit_id INTEGER;
    v_yoga_id INTEGER;
    v_technique_id INTEGER;
    v_sparring_id INTEGER;
    v_rowing_id INTEGER;
    v_dance_id INTEGER;
    v_static_id INTEGER;
    v_regeneration_id INTEGER;
    v_meditation_id INTEGER;
    v_mindfulness_id INTEGER;
BEGIN
    -- Get training_type IDs
    SELECT id INTO v_running_id FROM training_types WHERE subcategory = 'running' LIMIT 1;
    SELECT id INTO v_walk_id FROM training_types WHERE subcategory = 'walk' LIMIT 1;
    SELECT id INTO v_cycling_id FROM training_types WHERE subcategory = 'cycling' LIMIT 1;
    SELECT id INTO v_swimming_id FROM training_types WHERE subcategory = 'swimming' LIMIT 1;
    SELECT id INTO v_hypertrophy_id FROM training_types WHERE subcategory = 'hypertrophy' LIMIT 1;
    SELECT id INTO v_functional_id FROM training_types WHERE subcategory = 'functional' LIMIT 1;
    SELECT id INTO v_hiit_id FROM training_types WHERE subcategory = 'hiit' LIMIT 1;
    SELECT id INTO v_yoga_id FROM training_types WHERE subcategory = 'yoga' LIMIT 1;
    SELECT id INTO v_technique_id FROM training_types WHERE subcategory = 'technique' LIMIT 1;
    SELECT id INTO v_sparring_id FROM training_types WHERE subcategory = 'sparring' LIMIT 1;
    SELECT id INTO v_rowing_id FROM training_types WHERE subcategory = 'rowing' LIMIT 1;
    SELECT id INTO v_dance_id FROM training_types WHERE subcategory = 'dance' LIMIT 1;
    SELECT id INTO v_static_id FROM training_types WHERE subcategory = 'static' LIMIT 1;
    SELECT id INTO v_regeneration_id FROM training_types WHERE subcategory = 'regeneration' LIMIT 1;
    SELECT id INTO v_meditation_id FROM training_types WHERE subcategory = 'meditation' LIMIT 1;
    SELECT id INTO v_mindfulness_id FROM training_types WHERE subcategory = 'mindfulness' LIMIT 1;

    -- Insert default mappings (German Apple Health names).
    -- Caveat: for these global rows profile_id is NULL, and a UNIQUE
    -- constraint treats NULLs as distinct (before PostgreSQL 15's
    -- NULLS NOT DISTINCT), so ON CONFLICT does not deduplicate them on
    -- re-runs.
    INSERT INTO activity_type_mappings (activity_type, training_type_id, profile_id, source) VALUES
    -- German workout types
    ('Laufen', v_running_id, NULL, 'default'),
    ('Gehen', v_walk_id, NULL, 'default'),
    ('Wandern', v_walk_id, NULL, 'default'),
    ('Outdoor Spaziergang', v_walk_id, NULL, 'default'),
    ('Innenräume Spaziergang', v_walk_id, NULL, 'default'),
    ('Spaziergang', v_walk_id, NULL, 'default'),
    ('Radfahren', v_cycling_id, NULL, 'default'),
    ('Schwimmen', v_swimming_id, NULL, 'default'),
    ('Traditionelles Krafttraining', v_hypertrophy_id, NULL, 'default'),
    ('Funktionelles Krafttraining', v_functional_id, NULL, 'default'),
    ('Hochintensives Intervalltraining', v_hiit_id, NULL, 'default'),
    ('Yoga', v_yoga_id, NULL, 'default'),
    ('Kampfsport', v_technique_id, NULL, 'default'),
    ('Matrial Arts', v_technique_id, NULL, 'default'), -- Common misspelling of 'Martial Arts'
    ('Boxen', v_sparring_id, NULL, 'default'),
    ('Rudern', v_rowing_id, NULL, 'default'),
    ('Tanzen', v_dance_id, NULL, 'default'),
    ('Cardio Dance', v_dance_id, NULL, 'default'),
    ('Flexibilität', v_static_id, NULL, 'default'),
    ('Abwärmen', v_regeneration_id, NULL, 'default'),
    ('Cooldown', v_regeneration_id, NULL, 'default'),
    ('Meditation', v_meditation_id, NULL, 'default'),
    ('Achtsamkeit', v_mindfulness_id, NULL, 'default'),
    ('Geist & Körper', v_yoga_id, NULL, 'default')
    ON CONFLICT (activity_type, profile_id) DO NOTHING;

    -- English workout types (for compatibility)
    INSERT INTO activity_type_mappings (activity_type, training_type_id, profile_id, source) VALUES
    ('Running', v_running_id, NULL, 'default'),
    ('Walking', v_walk_id, NULL, 'default'),
    ('Hiking', v_walk_id, NULL, 'default'),
    ('Cycling', v_cycling_id, NULL, 'default'),
    ('Swimming', v_swimming_id, NULL, 'default'),
    ('Traditional Strength Training', v_hypertrophy_id, NULL, 'default'),
    ('Functional Strength Training', v_functional_id, NULL, 'default'),
    ('High Intensity Interval Training', v_hiit_id, NULL, 'default'),
    ('Martial Arts', v_technique_id, NULL, 'default'),
    ('Boxing', v_sparring_id, NULL, 'default'),
    ('Rowing', v_rowing_id, NULL, 'default'),
    ('Dance', v_dance_id, NULL, 'default'),
    ('Core Training', v_functional_id, NULL, 'default'),
    ('Flexibility', v_static_id, NULL, 'default'),
    ('Mindfulness', v_mindfulness_id, NULL, 'default')
    ON CONFLICT (activity_type, profile_id) DO NOTHING;
END $$;

-- ========================================
-- 4. Add comment
-- ========================================
COMMENT ON TABLE activity_type_mappings IS 'v9d Phase 1b: Learnable activity type to training type mappings. Replaces hardcoded mappings.';
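The NULL-vs-user-specific design of `activity_type_mappings` implies a lookup precedence: a profile-specific row should win over the global (`profile_id IS NULL`) default. A standalone Python sketch of that rule (the function and sample IDs are hypothetical, not code from this diff):

```python
from typing import Optional

def resolve_training_type(mappings, activity_type: str,
                          profile_id: Optional[str]) -> Optional[int]:
    """Prefer the user's own mapping; fall back to the global (NULL) row."""
    rows = [m for m in mappings if m['activity_type'] == activity_type]
    user_rows = [m for m in rows
                 if profile_id is not None and m['profile_id'] == profile_id]
    global_rows = [m for m in rows if m['profile_id'] is None]
    chosen = user_rows or global_rows
    return chosen[0]['training_type_id'] if chosen else None

mappings = [
    {'activity_type': 'Boxen', 'training_type_id': 401, 'profile_id': None},   # global default
    {'activity_type': 'Boxen', 'training_type_id': 400, 'profile_id': 'u-1'},  # user override
]
resolve_training_type(mappings, 'Boxen', 'u-1')  # user override wins -> 400
resolve_training_type(mappings, 'Boxen', 'u-2')  # no override, global -> 401
```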
backend/migrations/008_vitals_rest_days.sql (new file) | 59
@@ -0,0 +1,59 @@
-- Migration 008: Vitals, Rest Days, Weekly Goals
-- v9d Phase 2: Sleep & Vitals Module
-- Date: 2026-03-22

-- Rest Days
CREATE TABLE IF NOT EXISTS rest_days (
    id SERIAL PRIMARY KEY,
    profile_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,
    date DATE NOT NULL,
    type VARCHAR(20) NOT NULL CHECK (type IN ('full_rest', 'active_recovery')),
    note TEXT,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    CONSTRAINT unique_rest_day_per_profile UNIQUE(profile_id, date)
);
CREATE INDEX idx_rest_days_profile_date ON rest_days(profile_id, date DESC);

-- Vitals (resting HR + HRV)
CREATE TABLE IF NOT EXISTS vitals_log (
    id SERIAL PRIMARY KEY,
    profile_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,
    date DATE NOT NULL,
    resting_hr INTEGER CHECK (resting_hr > 0 AND resting_hr < 200),
    hrv INTEGER CHECK (hrv > 0),
    note TEXT,
    source VARCHAR(20) DEFAULT 'manual' CHECK (source IN ('manual', 'apple_health', 'garmin')),
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    CONSTRAINT unique_vitals_per_day UNIQUE(profile_id, date)
);
CREATE INDEX idx_vitals_profile_date ON vitals_log(profile_id, date DESC);

-- Extend activity_log with heart rate data
ALTER TABLE activity_log
ADD COLUMN IF NOT EXISTS avg_hr INTEGER CHECK (avg_hr > 0 AND avg_hr < 250),
ADD COLUMN IF NOT EXISTS max_hr INTEGER CHECK (max_hr > 0 AND max_hr < 250);

-- Extend profiles with HF max and sleep goal
ALTER TABLE profiles
ADD COLUMN IF NOT EXISTS hf_max INTEGER CHECK (hf_max > 0 AND hf_max < 250),
ADD COLUMN IF NOT EXISTS sleep_goal_minutes INTEGER DEFAULT 450 CHECK (sleep_goal_minutes > 0);

-- Weekly Goals (target vs. actual weekly planning)
|
||||||
|
CREATE TABLE IF NOT EXISTS weekly_goals (
|
||||||
|
id SERIAL PRIMARY KEY,
|
||||||
|
profile_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,
|
||||||
|
week_start DATE NOT NULL,
|
||||||
|
goals JSONB NOT NULL,
|
||||||
|
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
|
||||||
|
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
|
||||||
|
CONSTRAINT unique_weekly_goal_per_profile UNIQUE(profile_id, week_start)
|
||||||
|
);
|
||||||
|
CREATE INDEX idx_weekly_goals_profile_week ON weekly_goals(profile_id, week_start DESC);
|
||||||
|
|
||||||
|
-- Comments for documentation
|
||||||
|
COMMENT ON TABLE rest_days IS 'v9d Phase 2: Rest days tracking (full rest or active recovery)';
|
||||||
|
COMMENT ON TABLE vitals_log IS 'v9d Phase 2: Daily vitals (resting HR, HRV)';
|
||||||
|
COMMENT ON TABLE weekly_goals IS 'v9d Phase 2: Weekly training goals (Soll/Ist planning)';
|
||||||
|
COMMENT ON COLUMN profiles.hf_max IS 'Maximum heart rate for HR zone calculation';
|
||||||
|
COMMENT ON COLUMN profiles.sleep_goal_minutes IS 'Sleep goal in minutes (default: 450 = 7h 30min)';
|
||||||
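Not part of the diff: a minimal usage sketch. Because `vitals_log` carries the `unique_vitals_per_day` constraint, a repeated daily sync can be written as an upsert instead of failing on the second run (the UUID and values below are purely illustrative):

```sql
-- Hypothetical upsert against migration 008's vitals_log:
-- UNIQUE(profile_id, date) lets a re-sync update the same day's row.
INSERT INTO vitals_log (profile_id, date, resting_hr, hrv, source)
VALUES ('00000000-0000-0000-0000-000000000001', CURRENT_DATE, 52, 68, 'apple_health')
ON CONFLICT (profile_id, date)
DO UPDATE SET resting_hr = EXCLUDED.resting_hr,
              hrv        = EXCLUDED.hrv,
              updated_at = CURRENT_TIMESTAMP;
```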
backend/migrations/009_sleep_log.sql (new file, 31 lines)
@ -0,0 +1,31 @@
-- Migration 009: Sleep Log Table
-- v9d Phase 2b: Sleep Module Core
-- Date: 2026-03-22

CREATE TABLE IF NOT EXISTS sleep_log (
    id SERIAL PRIMARY KEY,
    profile_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,
    date DATE NOT NULL,
    bedtime TIME,
    wake_time TIME,
    duration_minutes INTEGER NOT NULL CHECK (duration_minutes > 0),
    quality INTEGER CHECK (quality >= 1 AND quality <= 5),
    wake_count INTEGER CHECK (wake_count >= 0),
    deep_minutes INTEGER CHECK (deep_minutes >= 0),
    rem_minutes INTEGER CHECK (rem_minutes >= 0),
    light_minutes INTEGER CHECK (light_minutes >= 0),
    awake_minutes INTEGER CHECK (awake_minutes >= 0),
    sleep_segments JSONB,
    note TEXT,
    source VARCHAR(20) DEFAULT 'manual' CHECK (source IN ('manual', 'apple_health', 'garmin')),
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    CONSTRAINT unique_sleep_per_day UNIQUE(profile_id, date)
);

CREATE INDEX idx_sleep_profile_date ON sleep_log(profile_id, date DESC);

-- Comments for documentation
COMMENT ON TABLE sleep_log IS 'v9d Phase 2b: Daily sleep tracking with phase data';
COMMENT ON COLUMN sleep_log.date IS 'Date of the night (wake date, not bedtime date)';
COMMENT ON COLUMN sleep_log.sleep_segments IS 'Raw phase segments: [{"phase": "deep", "start": "23:44", "duration_min": 42}, ...]';
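Not part of the diff: an illustrative insert showing how a night maps onto this schema. Note that `date` is the wake date, and `sleep_segments` follows the array shape documented in the `COMMENT` on that column; all values here are invented:

```sql
-- Hypothetical sleep entry for the night ending on 2026-03-22.
INSERT INTO sleep_log (profile_id, date, bedtime, wake_time,
                       duration_minutes, quality, deep_minutes, rem_minutes,
                       sleep_segments, source)
VALUES ('00000000-0000-0000-0000-000000000001', '2026-03-22',
        '23:30', '07:00', 435, 4, 62, 95,
        '[{"phase": "deep", "start": "23:44", "duration_min": 42}]'::jsonb,
        'apple_health');
```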
backend/migrations/010_rest_days_jsonb.sql (new file, 62 lines)
@ -0,0 +1,62 @@
-- Migration 010: Rest days refactoring to JSONB
-- v9d Phase 2a: Flexible, context-specific rest days
-- Date: 2026-03-22

-- Refactor rest_days to a JSONB config for flexible rest day types
-- OLD: type VARCHAR(20) CHECK (type IN ('full_rest', 'active_recovery'))
-- NEW: rest_config JSONB with {focus, rest_from[], allows[], intensity_max}

-- Drop old type column
ALTER TABLE rest_days
    DROP COLUMN IF EXISTS type;

-- Add new JSONB config column
ALTER TABLE rest_days
    ADD COLUMN IF NOT EXISTS rest_config JSONB NOT NULL DEFAULT '{"focus": "mental_rest", "rest_from": [], "allows": []}'::jsonb;

-- Validation function for rest_config
CREATE OR REPLACE FUNCTION validate_rest_config(config JSONB) RETURNS BOOLEAN AS $$
BEGIN
    -- Must have focus field
    IF NOT (config ? 'focus') THEN
        RETURN FALSE;
    END IF;

    -- focus must be one of the allowed values
    IF NOT (config->>'focus' IN ('muscle_recovery', 'cardio_recovery', 'mental_rest', 'deload', 'injury')) THEN
        RETURN FALSE;
    END IF;

    -- rest_from must be an array if present
    IF (config ? 'rest_from') AND jsonb_typeof(config->'rest_from') != 'array' THEN
        RETURN FALSE;
    END IF;

    -- allows must be an array if present
    IF (config ? 'allows') AND jsonb_typeof(config->'allows') != 'array' THEN
        RETURN FALSE;
    END IF;

    -- intensity_max must be a number between 1 and 100 if present
    IF (config ? 'intensity_max') AND (
        jsonb_typeof(config->'intensity_max') != 'number' OR
        (config->>'intensity_max')::int < 1 OR
        (config->>'intensity_max')::int > 100
    ) THEN
        RETURN FALSE;
    END IF;

    RETURN TRUE;
END;
$$ LANGUAGE plpgsql;

-- Add check constraint
ALTER TABLE rest_days
    ADD CONSTRAINT valid_rest_config CHECK (validate_rest_config(rest_config));

-- Add comment for documentation
COMMENT ON COLUMN rest_days.rest_config IS 'JSONB: {focus: string, rest_from: string[], allows: string[], intensity_max?: number (1-100), note?: string}';
COMMENT ON TABLE rest_days IS 'v9d Phase 2a: Context-specific rest days (strength rest but cardio allowed, etc.)';

-- Create GIN index on rest_config for faster JSONB queries
CREATE INDEX IF NOT EXISTS idx_rest_days_config ON rest_days USING GIN (rest_config);
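Not part of the diff: a sketch of what this config enables, assuming the schema state right after this migration (later migrations 011/012 in this PR additionally extract a `focus` column). The UUID and config values are illustrative:

```sql
-- Hypothetical context-specific rest day: strength rest, cardio allowed.
INSERT INTO rest_days (profile_id, date, rest_config)
VALUES ('00000000-0000-0000-0000-000000000001', CURRENT_DATE,
        '{"focus": "muscle_recovery",
          "rest_from": ["strength"],
          "allows": ["cardio", "mobility"],
          "intensity_max": 60}'::jsonb);

-- The GIN index can serve JSONB containment queries like:
SELECT date, rest_config
FROM rest_days
WHERE rest_config @> '{"focus": "muscle_recovery"}'::jsonb;
```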
backend/migrations/011_allow_multiple_rest_days_per_date.sql (new file, 17 lines)
@ -0,0 +1,17 @@
-- Migration 011: Allow Multiple Rest Days per Date
-- v9d Phase 2a: Support for multi-dimensional rest (development routes)
-- Date: 2026-03-22

-- Remove UNIQUE constraint to allow multiple rest day types per date
-- Use case: muscle recovery + mental rest on the same day
-- Future: development routes (Conditioning, Strength, Coordination, Mental, Mobility, Technique)

ALTER TABLE rest_days
    DROP CONSTRAINT IF EXISTS unique_rest_day_per_profile;

-- Add index for efficient queries (profile_id, date)
CREATE INDEX IF NOT EXISTS idx_rest_days_profile_date_multi
    ON rest_days(profile_id, date DESC);

-- Comment for documentation
COMMENT ON TABLE rest_days IS 'v9d Phase 2a: Multi-dimensional rest days - multiple entries per date allowed for different development routes (muscle, cardio, mental, coordination, technique)';
backend/migrations/012_rest_days_unique_focus.sql (new file, 34 lines)
@ -0,0 +1,34 @@
-- Migration 012: Unique constraint on (profile_id, date, focus)
-- v9d Phase 2a: Prevent duplicate rest day types per date
-- Date: 2026-03-22

-- Add focus column (extracted from rest_config for performance + constraints)
ALTER TABLE rest_days
    ADD COLUMN IF NOT EXISTS focus VARCHAR(20);

-- Populate from existing JSONB data
UPDATE rest_days
SET focus = rest_config->>'focus'
WHERE focus IS NULL;

-- Make NOT NULL (safe because we just populated all rows)
ALTER TABLE rest_days
    ALTER COLUMN focus SET NOT NULL;

-- Add CHECK constraint for valid focus values
ALTER TABLE rest_days
    ADD CONSTRAINT valid_focus CHECK (
        focus IN ('muscle_recovery', 'cardio_recovery', 'mental_rest', 'deload', 'injury')
    );

-- Add UNIQUE constraint: same profile + date + focus = duplicate
ALTER TABLE rest_days
    ADD CONSTRAINT unique_rest_day_per_focus
    UNIQUE (profile_id, date, focus);

-- Add index for efficient queries by focus
CREATE INDEX IF NOT EXISTS idx_rest_days_focus
    ON rest_days(focus);

-- Comment for documentation
COMMENT ON COLUMN rest_days.focus IS 'Extracted from rest_config.focus for performance and constraints. Prevents duplicate rest day types per date.';
backend/migrations/013_training_parameters.sql (new file, 145 lines)
@ -0,0 +1,145 @@
-- Migration 013: Training Parameters Registry
-- Training Type Profiles System - Foundation
-- Date: 2026-03-23
-- Issue: #15

-- ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
-- TRAINING PARAMETERS REGISTRY
-- Central definition of all measurable parameters for activities
-- ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

CREATE TABLE IF NOT EXISTS training_parameters (
    id SERIAL PRIMARY KEY,
    key VARCHAR(50) UNIQUE NOT NULL,
    name_de VARCHAR(100) NOT NULL,
    name_en VARCHAR(100) NOT NULL,
    category VARCHAR(50) NOT NULL,
    data_type VARCHAR(20) NOT NULL,
    unit VARCHAR(20),
    description_de TEXT,
    description_en TEXT,
    source_field VARCHAR(100),
    validation_rules JSONB DEFAULT '{}'::jsonb,
    is_active BOOLEAN DEFAULT true,
    created_at TIMESTAMP DEFAULT NOW(),

    CONSTRAINT chk_category CHECK (category IN (
        'physical', 'physiological', 'subjective', 'environmental', 'performance'
    )),
    CONSTRAINT chk_data_type CHECK (data_type IN (
        'integer', 'float', 'string', 'boolean'
    ))
);

CREATE INDEX idx_training_parameters_category ON training_parameters(category) WHERE is_active = true;
CREATE INDEX idx_training_parameters_key ON training_parameters(key) WHERE is_active = true;

COMMENT ON TABLE training_parameters IS 'Registry of all measurable activity parameters (Training Type Profiles System)';
COMMENT ON COLUMN training_parameters.key IS 'Unique identifier (e.g. "avg_hr", "duration_min")';
COMMENT ON COLUMN training_parameters.category IS 'Parameter category: physical, physiological, subjective, environmental, performance';
COMMENT ON COLUMN training_parameters.data_type IS 'Data type: integer, float, string, boolean';
COMMENT ON COLUMN training_parameters.source_field IS 'Mapping to activity_log column name';
COMMENT ON COLUMN training_parameters.validation_rules IS 'Min/Max/Enum for validation (JSONB)';

-- ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
-- STANDARD PARAMETERS
-- ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

INSERT INTO training_parameters (key, name_de, name_en, category, data_type, unit, source_field, validation_rules, description_de, description_en) VALUES

-- Physical Parameters
('duration_min', 'Dauer', 'Duration', 'physical', 'integer', 'min', 'duration_min',
 '{"min": 0, "max": 600}'::jsonb,
 'Trainingsdauer in Minuten',
 'Training duration in minutes'),

('distance_km', 'Distanz', 'Distance', 'physical', 'float', 'km', 'distance_km',
 '{"min": 0, "max": 200}'::jsonb,
 'Zurückgelegte Distanz in Kilometern',
 'Distance covered in kilometers'),

('kcal_active', 'Aktive Kalorien', 'Active Calories', 'physical', 'integer', 'kcal', 'kcal_active',
 '{"min": 0, "max": 5000}'::jsonb,
 'Aktiver Kalorienverbrauch',
 'Active calorie burn'),

('kcal_resting', 'Ruhekalorien', 'Resting Calories', 'physical', 'integer', 'kcal', 'kcal_resting',
 '{"min": 0, "max": 2000}'::jsonb,
 'Ruheumsatz während Training',
 'Resting calorie burn during training'),

('elevation_gain', 'Höhenmeter', 'Elevation Gain', 'physical', 'integer', 'm', 'elevation_gain',
 '{"min": 0, "max": 5000}'::jsonb,
 'Überwundene Höhenmeter',
 'Elevation gain in meters'),

('pace_min_per_km', 'Pace', 'Pace', 'physical', 'float', 'min/km', 'pace_min_per_km',
 '{"min": 2, "max": 20}'::jsonb,
 'Durchschnittstempo in Minuten pro Kilometer',
 'Average pace in minutes per kilometer'),

('cadence', 'Trittfrequenz', 'Cadence', 'physical', 'integer', 'spm', 'cadence',
 '{"min": 0, "max": 220}'::jsonb,
 'Schrittfrequenz (Schritte pro Minute)',
 'Step frequency (steps per minute)'),

-- Physiological Parameters
('avg_hr', 'Durchschnittspuls', 'Average Heart Rate', 'physiological', 'integer', 'bpm', 'hr_avg',
 '{"min": 30, "max": 220}'::jsonb,
 'Durchschnittliche Herzfrequenz',
 'Average heart rate'),

('max_hr', 'Maximalpuls', 'Max Heart Rate', 'physiological', 'integer', 'bpm', 'hr_max',
 '{"min": 40, "max": 220}'::jsonb,
 'Maximale Herzfrequenz',
 'Maximum heart rate'),

('min_hr', 'Minimalpuls', 'Min Heart Rate', 'physiological', 'integer', 'bpm', 'hr_min',
 '{"min": 30, "max": 200}'::jsonb,
 'Minimale Herzfrequenz',
 'Minimum heart rate'),

('avg_power', 'Durchschnittsleistung', 'Average Power', 'physiological', 'integer', 'W', 'avg_power',
 '{"min": 0, "max": 1000}'::jsonb,
 'Durchschnittliche Leistung in Watt',
 'Average power output in watts'),

-- Subjective Parameters
('rpe', 'RPE (Anstrengung)', 'RPE (Perceived Exertion)', 'subjective', 'integer', 'scale', 'rpe',
 '{"min": 1, "max": 10}'::jsonb,
 'Subjektive Anstrengung (Rate of Perceived Exertion)',
 'Rate of Perceived Exertion'),

-- Environmental Parameters
('temperature_celsius', 'Temperatur', 'Temperature', 'environmental', 'float', '°C', 'temperature_celsius',
 '{"min": -30, "max": 50}'::jsonb,
 'Umgebungstemperatur in Celsius',
 'Ambient temperature in Celsius'),

('humidity_percent', 'Luftfeuchtigkeit', 'Humidity', 'environmental', 'integer', '%', 'humidity_percent',
 '{"min": 0, "max": 100}'::jsonb,
 'Relative Luftfeuchtigkeit in Prozent',
 'Relative humidity in percent'),

-- Performance Parameters (calculated)
('avg_hr_percent', '% Max-HF', '% Max HR', 'performance', 'float', '%', 'avg_hr_percent',
 '{"min": 0, "max": 100}'::jsonb,
 'Durchschnittspuls als Prozent der maximalen Herzfrequenz',
 'Average heart rate as percentage of max heart rate'),

('kcal_per_km', 'Kalorien pro km', 'Calories per km', 'performance', 'float', 'kcal/km', 'kcal_per_km',
 '{"min": 0, "max": 1000}'::jsonb,
 'Kalorienverbrauch pro Kilometer',
 'Calorie burn per kilometer');

-- ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
-- SUMMARY
-- ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

-- Display inserted parameters
DO $$
BEGIN
    RAISE NOTICE '✓ Migration 013 completed';
    RAISE NOTICE '  - Created training_parameters table';
    RAISE NOTICE '  - Inserted % standard parameters', (SELECT COUNT(*) FROM training_parameters);
END $$;
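Not part of the diff: a sketch of how a consumer might use the registry, e.g. to range-check an incoming value against a parameter's `validation_rules` JSONB instead of hardcoding limits:

```sql
-- Hypothetical lookup: resolve the avg_hr parameter's validation range
-- from the registry seeded by migration 013.
SELECT key, data_type, unit,
       (validation_rules->>'min')::numeric AS min_value,
       (validation_rules->>'max')::numeric AS max_value
FROM training_parameters
WHERE key = 'avg_hr' AND is_active = true;
```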
backend/migrations/014_training_profiles.sql (new file, 114 lines)
@ -0,0 +1,114 @@
-- Migration 014: Training Type Profiles & Activity Evaluation
-- Training Type Profiles System - Schema Extensions
-- Date: 2026-03-23
-- Issue: #15

-- ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
-- EXTEND TRAINING TYPES
-- Add profile column for comprehensive training type configuration
-- ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

ALTER TABLE training_types ADD COLUMN IF NOT EXISTS profile JSONB DEFAULT NULL;

CREATE INDEX idx_training_types_profile_enabled ON training_types
    ((profile->'rule_sets'->'minimum_requirements'->>'enabled'))
    WHERE profile IS NOT NULL;

COMMENT ON COLUMN training_types.profile IS 'Comprehensive training type profile with 7 dimensions (rule_sets, intensity_zones, training_effects, periodization, performance_indicators, safety, ai_context)';

-- ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
-- EXTEND ACTIVITY LOG
-- Add evaluation results and quality labels
-- ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

ALTER TABLE activity_log ADD COLUMN IF NOT EXISTS evaluation JSONB DEFAULT NULL;
ALTER TABLE activity_log ADD COLUMN IF NOT EXISTS quality_label VARCHAR(20);
ALTER TABLE activity_log ADD COLUMN IF NOT EXISTS overall_score FLOAT;

CREATE INDEX idx_activity_quality_label ON activity_log(quality_label)
    WHERE quality_label IS NOT NULL;

CREATE INDEX idx_activity_overall_score ON activity_log(overall_score DESC)
    WHERE overall_score IS NOT NULL;

CREATE INDEX idx_activity_evaluation_passed ON activity_log
    ((evaluation->'rule_set_results'->'minimum_requirements'->>'passed'))
    WHERE evaluation IS NOT NULL;

COMMENT ON COLUMN activity_log.evaluation IS 'Complete evaluation result (7 dimensions, scores, recommendations, warnings)';
COMMENT ON COLUMN activity_log.quality_label IS 'Quality label: excellent, good, acceptable, poor (for quick filtering)';
COMMENT ON COLUMN activity_log.overall_score IS 'Overall quality score 0.0-1.0 (for sorting)';

-- ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
-- ADD MISSING COLUMNS (if not already added by previous migrations)
-- ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

-- Add HR columns if not exist (might be in Migration 008)
DO $$
BEGIN
    IF NOT EXISTS (SELECT 1 FROM information_schema.columns
                   WHERE table_name='activity_log' AND column_name='hr_min') THEN
        ALTER TABLE activity_log ADD COLUMN hr_min INTEGER CHECK (hr_min > 0 AND hr_min < 200);
    END IF;
END $$;

-- Add performance columns for calculated values
ALTER TABLE activity_log ADD COLUMN IF NOT EXISTS avg_hr_percent FLOAT;
ALTER TABLE activity_log ADD COLUMN IF NOT EXISTS kcal_per_km FLOAT;
ALTER TABLE activity_log ADD COLUMN IF NOT EXISTS pace_min_per_km FLOAT;
ALTER TABLE activity_log ADD COLUMN IF NOT EXISTS cadence INTEGER;
ALTER TABLE activity_log ADD COLUMN IF NOT EXISTS avg_power INTEGER;
ALTER TABLE activity_log ADD COLUMN IF NOT EXISTS elevation_gain INTEGER;
ALTER TABLE activity_log ADD COLUMN IF NOT EXISTS temperature_celsius FLOAT;
ALTER TABLE activity_log ADD COLUMN IF NOT EXISTS humidity_percent INTEGER;

COMMENT ON COLUMN activity_log.avg_hr_percent IS 'Average HR as percentage of user max HR (calculated)';
COMMENT ON COLUMN activity_log.kcal_per_km IS 'Calories burned per kilometer (calculated)';

-- ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
-- HELPER FUNCTION: Calculate avg_hr_percent
-- ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

CREATE OR REPLACE FUNCTION calculate_avg_hr_percent()
RETURNS TRIGGER AS $$
DECLARE
    user_max_hr INTEGER;
BEGIN
    -- Get user's max HR from profile
    SELECT hf_max INTO user_max_hr
    FROM profiles
    WHERE id = NEW.profile_id;

    -- Calculate percentage if both values exist
    IF NEW.hr_avg IS NOT NULL AND user_max_hr IS NOT NULL AND user_max_hr > 0 THEN
        NEW.avg_hr_percent := (NEW.hr_avg::float / user_max_hr::float) * 100;
    END IF;

    -- Calculate kcal per km
    IF NEW.kcal_active IS NOT NULL AND NEW.distance_km IS NOT NULL AND NEW.distance_km > 0 THEN
        NEW.kcal_per_km := NEW.kcal_active::float / NEW.distance_km;
    END IF;

    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

-- Trigger for automatic calculation
DROP TRIGGER IF EXISTS trg_calculate_performance_metrics ON activity_log;
CREATE TRIGGER trg_calculate_performance_metrics
    BEFORE INSERT OR UPDATE ON activity_log
    FOR EACH ROW
    EXECUTE FUNCTION calculate_avg_hr_percent();

-- ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
-- SUMMARY
-- ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

DO $$
BEGIN
    RAISE NOTICE '✓ Migration 014 completed';
    RAISE NOTICE '  - Extended training_types with profile column';
    RAISE NOTICE '  - Extended activity_log with evaluation columns';
    RAISE NOTICE '  - Added performance metric calculations';
    RAISE NOTICE '  - Created indexes for fast queries';
END $$;
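Not part of the diff: an illustration of the `BEFORE` trigger's effect. For a profile whose `hf_max` is 190, inserting a run populates the calculated columns automatically (UUID and values invented; other required `activity_log` columns omitted for brevity):

```sql
-- Hypothetical insert hitting trg_calculate_performance_metrics.
INSERT INTO activity_log (profile_id, hr_avg, kcal_active, distance_km)
VALUES ('00000000-0000-0000-0000-000000000001', 152, 480, 8.0);
-- With hf_max = 190 the trigger sets:
--   avg_hr_percent = 152 / 190 * 100 = 80.0
--   kcal_per_km    = 480 / 8.0       = 60.0
```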
backend/migrations/014_vitals_extended.sql (new file, 29 lines)
@ -0,0 +1,29 @@
-- Migration 014: Extended Vitals (Blood Pressure, VO2 Max, SpO2, Respiratory Rate)
-- v9d Phase 2d: Complete vitals tracking
-- Date: 2026-03-23

-- Add new vitals fields
ALTER TABLE vitals_log
    ADD COLUMN IF NOT EXISTS blood_pressure_systolic INTEGER CHECK (blood_pressure_systolic > 0 AND blood_pressure_systolic < 300),
    ADD COLUMN IF NOT EXISTS blood_pressure_diastolic INTEGER CHECK (blood_pressure_diastolic > 0 AND blood_pressure_diastolic < 200),
    ADD COLUMN IF NOT EXISTS pulse INTEGER CHECK (pulse > 0 AND pulse < 250),
    ADD COLUMN IF NOT EXISTS vo2_max DECIMAL(4,1) CHECK (vo2_max > 0 AND vo2_max < 100),
    ADD COLUMN IF NOT EXISTS spo2 INTEGER CHECK (spo2 >= 70 AND spo2 <= 100),
    ADD COLUMN IF NOT EXISTS respiratory_rate DECIMAL(4,1) CHECK (respiratory_rate > 0 AND respiratory_rate < 60),
    ADD COLUMN IF NOT EXISTS irregular_heartbeat BOOLEAN DEFAULT false,
    ADD COLUMN IF NOT EXISTS possible_afib BOOLEAN DEFAULT false;

-- Update source check to include omron
ALTER TABLE vitals_log DROP CONSTRAINT IF EXISTS vitals_log_source_check;
ALTER TABLE vitals_log ADD CONSTRAINT vitals_log_source_check
    CHECK (source IN ('manual', 'apple_health', 'garmin', 'omron'));

-- Comments
COMMENT ON COLUMN vitals_log.blood_pressure_systolic IS 'Systolic blood pressure (mmHg) from Omron or manual entry';
COMMENT ON COLUMN vitals_log.blood_pressure_diastolic IS 'Diastolic blood pressure (mmHg) from Omron or manual entry';
COMMENT ON COLUMN vitals_log.pulse IS 'Pulse during blood pressure measurement (bpm)';
COMMENT ON COLUMN vitals_log.vo2_max IS 'VO2 Max from Apple Watch (ml/kg/min)';
COMMENT ON COLUMN vitals_log.spo2 IS 'Blood oxygen saturation (%) from Apple Watch';
COMMENT ON COLUMN vitals_log.respiratory_rate IS 'Respiratory rate (breaths/min) from Apple Watch';
COMMENT ON COLUMN vitals_log.irregular_heartbeat IS 'Irregular heartbeat detected (Omron)';
COMMENT ON COLUMN vitals_log.possible_afib IS 'Possible atrial fibrillation (Omron)';
backend/migrations/015_vitals_refactoring.sql (new file, 184 lines)
@ -0,0 +1,184 @@
|
||||||
|
-- Migration 015: Vitals Refactoring - Trennung Baseline vs. Context-Dependent
|
||||||
|
-- v9d Phase 2d: Architektur-Verbesserung für bessere Datenqualität
|
||||||
|
-- Date: 2026-03-23
|
||||||
|
|
||||||
|
-- ══════════════════════════════════════════════════════════════════════════════
|
||||||
|
-- STEP 1: Create new tables
|
||||||
|
-- ══════════════════════════════════════════════════════════════════════════════
|
||||||
|
|
||||||
|
-- Baseline Vitals (slow-changing, once daily, morning measurement)
|
||||||
|
CREATE TABLE IF NOT EXISTS vitals_baseline (
|
||||||
|
id SERIAL PRIMARY KEY,
|
||||||
|
profile_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,
|
||||||
|
date DATE NOT NULL,
|
||||||
|
|
||||||
|
-- Core baseline vitals
|
||||||
|
resting_hr INTEGER CHECK (resting_hr > 0 AND resting_hr < 120),
|
||||||
|
hrv INTEGER CHECK (hrv > 0 AND hrv < 300),
|
||||||
|
vo2_max DECIMAL(4,1) CHECK (vo2_max > 0 AND vo2_max < 100),
|
||||||
|
spo2 INTEGER CHECK (spo2 >= 70 AND spo2 <= 100),
|
||||||
|
respiratory_rate DECIMAL(4,1) CHECK (respiratory_rate > 0 AND respiratory_rate < 60),
|
||||||
|
|
||||||
|
-- Future baseline vitals (prepared for expansion)
|
||||||
|
body_temperature DECIMAL(3,1) CHECK (body_temperature > 30 AND body_temperature < 45),
|
||||||
|
resting_metabolic_rate INTEGER CHECK (resting_metabolic_rate > 0),
|
||||||
|
|
||||||
|
-- Metadata
|
||||||
|
note TEXT,
|
||||||
|
source VARCHAR(20) DEFAULT 'manual' CHECK (source IN ('manual', 'apple_health', 'garmin', 'withings')),
|
||||||
|
created_at TIMESTAMP DEFAULT NOW(),
|
||||||
|
updated_at TIMESTAMP DEFAULT NOW(),
|
||||||
|
|
||||||
|
CONSTRAINT unique_baseline_per_day UNIQUE(profile_id, date)
|
||||||
|
);
|
||||||
|
|
||||||
|
CREATE INDEX idx_vitals_baseline_profile_date ON vitals_baseline(profile_id, date DESC);
|
||||||
|
|
||||||
|
COMMENT ON TABLE vitals_baseline IS 'v9d Phase 2d: Baseline vitals measured once daily (morning, fasted)';
|
||||||
|
COMMENT ON COLUMN vitals_baseline.resting_hr IS 'Resting heart rate (bpm) - measured in the morning before getting up';
|
||||||
|
COMMENT ON COLUMN vitals_baseline.hrv IS 'Heart rate variability (ms) - higher is better';
|
||||||
|
COMMENT ON COLUMN vitals_baseline.vo2_max IS 'VO2 Max (ml/kg/min) - estimated by Apple Watch or lab test';
|
||||||
|
COMMENT ON COLUMN vitals_baseline.spo2 IS 'Blood oxygen saturation (%) - baseline measurement';
|
||||||
|
COMMENT ON COLUMN vitals_baseline.respiratory_rate IS 'Respiratory rate (breaths/min) - baseline measurement';
|
||||||
|
|
||||||
|
-- ══════════════════════════════════════════════════════════════════════════════
|
||||||
|
|
||||||
|
-- Blood Pressure Log (context-dependent, multiple times per day)
|
||||||
|
CREATE TABLE IF NOT EXISTS blood_pressure_log (
|
||||||
|
id SERIAL PRIMARY KEY,
|
||||||
|
profile_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,
|
||||||
|
measured_at TIMESTAMP NOT NULL,
|
||||||
|
|
||||||
|
-- Blood pressure measurements
|
||||||
|
systolic INTEGER NOT NULL CHECK (systolic > 0 AND systolic < 300),
|
||||||
|
diastolic INTEGER NOT NULL CHECK (diastolic > 0 AND diastolic < 200),
|
||||||
|
pulse INTEGER CHECK (pulse > 0 AND pulse < 250),
|
||||||
|
|
||||||
|
-- Context tagging for correlation analysis
|
||||||
|
context VARCHAR(30) CHECK (context IN (
|
||||||
|
'morning_fasted', -- Morgens nüchtern
|
||||||
|
'after_meal', -- Nach dem Essen
|
||||||
|
'before_training', -- Vor dem Training
|
||||||
|
'after_training', -- Nach dem Training
|
||||||
|
'evening', -- Abends
|
||||||
|
'stress', -- Bei Stress
|
||||||
|
'resting', -- Ruhemessung
|
||||||
|
'other' -- Sonstiges
|
||||||
|
)),
|
||||||
|
|
||||||
|
-- Warning flags (Omron)
|
||||||
|
irregular_heartbeat BOOLEAN DEFAULT false,
|
||||||
|
possible_afib BOOLEAN DEFAULT false,
|
||||||
|
|
||||||
|
-- Metadata
|
||||||
|
note TEXT,
|
||||||
|
source VARCHAR(20) DEFAULT 'manual' CHECK (source IN ('manual', 'omron', 'apple_health', 'withings')),
|
||||||
|
created_at TIMESTAMP DEFAULT NOW(),
|
||||||
|
|
||||||
|
CONSTRAINT unique_bp_measurement UNIQUE(profile_id, measured_at)
|
||||||
|
);
|
||||||
|

CREATE INDEX idx_blood_pressure_profile_datetime ON blood_pressure_log(profile_id, measured_at DESC);
CREATE INDEX idx_blood_pressure_context ON blood_pressure_log(context) WHERE context IS NOT NULL;

COMMENT ON TABLE blood_pressure_log IS 'v9d Phase 2d: Blood pressure measurements (multiple per day, context-aware)';
COMMENT ON COLUMN blood_pressure_log.context IS 'Measurement context for correlation analysis';
COMMENT ON COLUMN blood_pressure_log.irregular_heartbeat IS 'Irregular heartbeat detected (Omron device)';
COMMENT ON COLUMN blood_pressure_log.possible_afib IS 'Possible atrial fibrillation (Omron device)';
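The composite `(profile_id, measured_at DESC)` index and the partial `context` index support the typical read path. A minimal sketch of such a query (the profile UUID is a placeholder, not from the migration):

```sql
-- Average blood pressure per context over the last 30 days for one profile
SELECT context,
       ROUND(AVG(systolic))  AS avg_systolic,
       ROUND(AVG(diastolic)) AS avg_diastolic,
       COUNT(*)              AS readings
FROM blood_pressure_log
WHERE profile_id = '00000000-0000-0000-0000-000000000000'
  AND measured_at >= NOW() - INTERVAL '30 days'
GROUP BY context
ORDER BY readings DESC;
```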

-- ══════════════════════════════════════════════════════════════════════════════
-- STEP 2: Migrate existing data from vitals_log
-- ══════════════════════════════════════════════════════════════════════════════

-- Migrate baseline vitals (RHR, HRV, VO2 Max, SpO2, Respiratory Rate)
INSERT INTO vitals_baseline (
    profile_id, date,
    resting_hr, hrv, vo2_max, spo2, respiratory_rate,
    note, source, created_at, updated_at
)
SELECT
    profile_id, date,
    resting_hr, hrv, vo2_max, spo2, respiratory_rate,
    note, source, created_at, updated_at
FROM vitals_log
WHERE resting_hr IS NOT NULL
   OR hrv IS NOT NULL
   OR vo2_max IS NOT NULL
   OR spo2 IS NOT NULL
   OR respiratory_rate IS NOT NULL
ON CONFLICT (profile_id, date) DO NOTHING;
-- Migrate blood pressure measurements
-- Note: Use date + 08:00 as default timestamp (morning measurement)
INSERT INTO blood_pressure_log (
    profile_id, measured_at,
    systolic, diastolic, pulse,
    irregular_heartbeat, possible_afib,
    note, source, created_at
)
SELECT
    profile_id,
    (date + TIME '08:00:00')::timestamp AS measured_at,
    blood_pressure_systolic,
    blood_pressure_diastolic,
    pulse,
    irregular_heartbeat,
    possible_afib,
    note,
    CASE
        WHEN source = 'manual' THEN 'manual'
        WHEN source = 'omron' THEN 'omron'
        ELSE 'manual'
    END AS source,
    created_at
FROM vitals_log
WHERE blood_pressure_systolic IS NOT NULL
  AND blood_pressure_diastolic IS NOT NULL
ON CONFLICT (profile_id, measured_at) DO NOTHING;
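Before Step 3 renames the source table away, a quick count comparison can confirm the copy. A sketch (assumes blood_pressure_log was empty before this migration; duplicate `(profile_id, measured_at)` pairs skipped by `ON CONFLICT` would explain a small difference):

```sql
-- Sanity check: source rows with complete BP data vs. migrated rows
SELECT
    (SELECT COUNT(*) FROM vitals_log
     WHERE blood_pressure_systolic IS NOT NULL
       AND blood_pressure_diastolic IS NOT NULL) AS source_bp_rows,
    (SELECT COUNT(*) FROM blood_pressure_log)    AS migrated_bp_rows;
```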

-- ══════════════════════════════════════════════════════════════════════════════
-- STEP 3: Drop old vitals_log table (backup first)
-- ══════════════════════════════════════════════════════════════════════════════

-- Rename old table as backup (keep for safety, can be dropped later)
ALTER TABLE vitals_log RENAME TO vitals_log_backup_pre_015;

-- Drop old index (it's on the renamed table now)
DROP INDEX IF EXISTS idx_vitals_profile_date;

-- ══════════════════════════════════════════════════════════════════════════════
-- STEP 4: Prepared for future vitals types
-- ══════════════════════════════════════════════════════════════════════════════

-- Future tables (commented out, create when needed):

-- Glucose Log (for blood sugar tracking)
-- CREATE TABLE glucose_log (
--     id SERIAL PRIMARY KEY,
--     profile_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,
--     measured_at TIMESTAMP NOT NULL,
--     glucose_mg_dl INTEGER NOT NULL CHECK (glucose_mg_dl > 0 AND glucose_mg_dl < 500),
--     context VARCHAR(30) CHECK (context IN (
--         'fasted', 'before_meal', 'after_meal_1h', 'after_meal_2h', 'before_training', 'after_training', 'other'
--     )),
--     note TEXT,
--     source VARCHAR(20) DEFAULT 'manual',
--     created_at TIMESTAMP DEFAULT NOW(),
--     CONSTRAINT unique_glucose_measurement UNIQUE(profile_id, measured_at)
-- );

-- Temperature Log (for illness tracking)
-- CREATE TABLE temperature_log (
--     id SERIAL PRIMARY KEY,
--     profile_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,
--     measured_at TIMESTAMP NOT NULL,
--     temperature_celsius DECIMAL(3,1) NOT NULL CHECK (temperature_celsius > 30 AND temperature_celsius < 45),
--     measurement_location VARCHAR(20) CHECK (measurement_location IN ('oral', 'ear', 'forehead', 'armpit')),
--     note TEXT,
--     created_at TIMESTAMP DEFAULT NOW(),
--     CONSTRAINT unique_temperature_measurement UNIQUE(profile_id, measured_at)
-- );

-- ══════════════════════════════════════════════════════════════════════════════
-- Migration complete
-- ══════════════════════════════════════════════════════════════════════════════
21  backend/migrations/016_global_quality_filter.sql  Normal file
@ -0,0 +1,21 @@
-- Migration 016: Global Quality Filter Setting
-- Issue: #31
-- Date: 2026-03-23
-- Description: Add quality_filter_level to profiles for consistent data views

-- Add quality_filter_level column to profiles
ALTER TABLE profiles ADD COLUMN IF NOT EXISTS quality_filter_level VARCHAR(20) DEFAULT 'all';

COMMENT ON COLUMN profiles.quality_filter_level IS 'Global quality filter for all activity views: all, quality, very_good, excellent';

-- Create index for performance (if filtering becomes common)
CREATE INDEX IF NOT EXISTS idx_profiles_quality_filter ON profiles(quality_filter_level);

-- Migration tracking
DO $$
BEGIN
    RAISE NOTICE '✓ Migration 016: Added global quality filter setting';
    RAISE NOTICE '  - Added profiles.quality_filter_level column';
    RAISE NOTICE '  - Default: all (no filter)';
    RAISE NOTICE '  - Values: all, quality, very_good, excellent';
END $$;
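The setting is written per profile; a minimal sketch of the write path (the profile UUID is a placeholder, and the value must be one of the levels named in the column comment):

```sql
-- Switch a profile's global activity filter to 'very_good'
UPDATE profiles
SET quality_filter_level = 'very_good'
WHERE id = '00000000-0000-0000-0000-000000000000';
```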
22  backend/migrations/017_ai_prompts_flexibilisierung.sql  Normal file
@ -0,0 +1,22 @@
-- Migration 017: More flexible AI prompts (Issue #28)
-- Add category column to ai_prompts for better organization and filtering

-- Add category column
ALTER TABLE ai_prompts ADD COLUMN IF NOT EXISTS category VARCHAR(20) DEFAULT 'ganzheitlich';

-- Create index for category filtering
CREATE INDEX IF NOT EXISTS idx_ai_prompts_category ON ai_prompts(category);

-- Add comment
COMMENT ON COLUMN ai_prompts.category IS 'Prompt category: körper, ernährung, training, schlaf, vitalwerte, ziele, ganzheitlich';

-- Update existing prompts with appropriate categories
-- Based on slug patterns and content
UPDATE ai_prompts SET category = 'körper' WHERE slug IN ('koerperkomposition', 'gewichtstrend', 'umfaenge', 'caliper');
UPDATE ai_prompts SET category = 'ernährung' WHERE slug IN ('ernaehrung', 'kalorienbilanz', 'protein', 'makros');
UPDATE ai_prompts SET category = 'training' WHERE slug IN ('aktivitaet', 'trainingsanalyse', 'erholung', 'leistung');
UPDATE ai_prompts SET category = 'schlaf' WHERE slug LIKE '%schlaf%';
UPDATE ai_prompts SET category = 'vitalwerte' WHERE slug IN ('vitalwerte', 'herzfrequenz', 'ruhepuls', 'hrv');
UPDATE ai_prompts SET category = 'ziele' WHERE slug LIKE '%ziel%' OR slug LIKE '%goal%';

-- Pipeline prompts remain 'ganzheitlich' (default)
20  backend/migrations/018_prompt_display_name.sql  Normal file
@ -0,0 +1,20 @@
-- Migration 018: Add display_name to ai_prompts for user-facing labels

ALTER TABLE ai_prompts ADD COLUMN IF NOT EXISTS display_name VARCHAR(100);

-- Migrate existing prompts from hardcoded SLUG_LABELS
UPDATE ai_prompts SET display_name = '🔍 Gesamtanalyse' WHERE slug = 'gesamt' AND display_name IS NULL;
UPDATE ai_prompts SET display_name = '🫧 Körperkomposition' WHERE slug = 'koerper' AND display_name IS NULL;
UPDATE ai_prompts SET display_name = '🍽️ Ernährung' WHERE slug = 'ernaehrung' AND display_name IS NULL;
UPDATE ai_prompts SET display_name = '🏋️ Aktivität' WHERE slug = 'aktivitaet' AND display_name IS NULL;
UPDATE ai_prompts SET display_name = '❤️ Gesundheitsindikatoren' WHERE slug = 'gesundheit' AND display_name IS NULL;
UPDATE ai_prompts SET display_name = '🎯 Zielfortschritt' WHERE slug = 'ziele' AND display_name IS NULL;
UPDATE ai_prompts SET display_name = '🔬 Mehrstufige Gesamtanalyse' WHERE slug = 'pipeline' AND display_name IS NULL;
UPDATE ai_prompts SET display_name = '🔬 Pipeline: Körper-Analyse (JSON)' WHERE slug = 'pipeline_body' AND display_name IS NULL;
UPDATE ai_prompts SET display_name = '🔬 Pipeline: Ernährungs-Analyse (JSON)' WHERE slug = 'pipeline_nutrition' AND display_name IS NULL;
UPDATE ai_prompts SET display_name = '🔬 Pipeline: Aktivitäts-Analyse (JSON)' WHERE slug = 'pipeline_activity' AND display_name IS NULL;
UPDATE ai_prompts SET display_name = '🔬 Pipeline: Synthese' WHERE slug = 'pipeline_synthesis' AND display_name IS NULL;
UPDATE ai_prompts SET display_name = '🔬 Pipeline: Zielabgleich' WHERE slug = 'pipeline_goals' AND display_name IS NULL;

-- Fallback: use name as display_name if still NULL
UPDATE ai_prompts SET display_name = name WHERE display_name IS NULL;
157  backend/migrations/019_pipeline_system.sql  Normal file
@ -0,0 +1,157 @@
-- Migration 019: Pipeline system - configurable multi-stage analyses
-- Enables admin management of pipeline configurations (Issue #28)
-- Created: 2026-03-25

-- ========================================
-- 1. Extend ai_prompts for the reset feature
-- ========================================
ALTER TABLE ai_prompts
    ADD COLUMN IF NOT EXISTS is_system_default BOOLEAN DEFAULT FALSE,
    ADD COLUMN IF NOT EXISTS default_template TEXT;

COMMENT ON COLUMN ai_prompts.is_system_default IS 'true = system prompt with reset function';
COMMENT ON COLUMN ai_prompts.default_template IS 'Original template for reset-to-default';

-- Mark existing pipeline prompts as system defaults
UPDATE ai_prompts
SET
    is_system_default = true,
    default_template = template
WHERE slug LIKE 'pipeline_%';

-- ========================================
-- 2. Create pipeline_configs table
-- ========================================
CREATE TABLE IF NOT EXISTS pipeline_configs (
    id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
    name VARCHAR(255) NOT NULL UNIQUE,
    description TEXT,
    is_default BOOLEAN DEFAULT FALSE,
    active BOOLEAN DEFAULT TRUE,

    -- Module configuration: which data sources to include
    modules JSONB NOT NULL DEFAULT '{}'::jsonb,
    -- Example: {"körper": true, "ernährung": true, "training": true, "schlaf": false}

    -- Timeframes per module (days)
    timeframes JSONB NOT NULL DEFAULT '{}'::jsonb,
    -- Example: {"körper": 30, "ernährung": 30, "training": 14}

    -- Stage 1 prompts (parallel execution)
    stage1_prompts TEXT[] NOT NULL DEFAULT ARRAY[]::TEXT[],
    -- Example: ARRAY['pipeline_body', 'pipeline_nutrition', 'pipeline_activity']

    -- Stage 2 prompt (synthesis)
    stage2_prompt VARCHAR(100) NOT NULL,
    -- Example: 'pipeline_synthesis'

    -- Stage 3 prompt (optional, e.g., goals)
    stage3_prompt VARCHAR(100),
    -- Example: 'pipeline_goals'

    created TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
    updated TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP
);

-- ========================================
-- 3. Create indexes
-- ========================================
CREATE INDEX IF NOT EXISTS idx_pipeline_configs_default ON pipeline_configs(is_default) WHERE is_default = true;
CREATE INDEX IF NOT EXISTS idx_pipeline_configs_active ON pipeline_configs(active);

-- ========================================
-- 4. Seed: standard pipeline "Alltags-Check"
-- ========================================
INSERT INTO pipeline_configs (
    name,
    description,
    is_default,
    modules,
    timeframes,
    stage1_prompts,
    stage2_prompt,
    stage3_prompt
) VALUES (
    'Alltags-Check',
    'Standard-Analyse: Körper, Ernährung, Training über die letzten 2-4 Wochen',
    true,
    '{"körper": true, "ernährung": true, "training": true, "schlaf": false, "vitalwerte": false, "mentales": false, "ziele": false}'::jsonb,
    '{"körper": 30, "ernährung": 30, "training": 14}'::jsonb,
    ARRAY['pipeline_body', 'pipeline_nutrition', 'pipeline_activity'],
    'pipeline_synthesis',
    'pipeline_goals'
) ON CONFLICT (name) DO NOTHING;
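A pipeline runner would resolve the default config and its Stage-1 templates in one query; a sketch (assumes ai_prompts carries slug and template columns, as used elsewhere in these migrations):

```sql
-- Fetch the active default pipeline plus the templates of its Stage-1 prompts
SELECT pc.name,
       pc.stage2_prompt,
       pc.stage3_prompt,
       p.slug,
       p.template
FROM pipeline_configs pc
JOIN ai_prompts p ON p.slug = ANY (pc.stage1_prompts)
WHERE pc.is_default = true
  AND pc.active = true;
```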

-- ========================================
-- 5. Seed: additional pipelines (optional)
-- ========================================

-- Sleep-focus pipeline (if sleep prompts exist)
INSERT INTO pipeline_configs (
    name,
    description,
    is_default,
    modules,
    timeframes,
    stage1_prompts,
    stage2_prompt,
    stage3_prompt
) VALUES (
    'Schlaf & Erholung',
    'Analyse von Schlaf, Vitalwerten und Erholungsstatus',
    false,
    '{"schlaf": true, "vitalwerte": true, "training": true, "körper": false, "ernährung": false, "mentales": false, "ziele": false}'::jsonb,
    '{"schlaf": 14, "vitalwerte": 7, "training": 14}'::jsonb,
    ARRAY['pipeline_sleep', 'pipeline_vitals', 'pipeline_activity'],
    'pipeline_synthesis',
    NULL
) ON CONFLICT (name) DO NOTHING;

-- Competition analysis (long-term trend)
INSERT INTO pipeline_configs (
    name,
    description,
    is_default,
    modules,
    timeframes,
    stage1_prompts,
    stage2_prompt,
    stage3_prompt
) VALUES (
    'Wettkampf-Analyse',
    'Langfristige Analyse für Wettkampfvorbereitung (90 Tage)',
    false,
    '{"körper": true, "training": true, "vitalwerte": true, "ernährung": true, "schlaf": false, "mentales": false, "ziele": true}'::jsonb,
    '{"körper": 90, "training": 90, "vitalwerte": 30, "ernährung": 60}'::jsonb,
    ARRAY['pipeline_body', 'pipeline_activity', 'pipeline_vitals', 'pipeline_nutrition'],
    'pipeline_synthesis',
    'pipeline_goals'
) ON CONFLICT (name) DO NOTHING;

-- ========================================
-- 6. Trigger for the updated timestamp
-- ========================================
DROP TRIGGER IF EXISTS trigger_pipeline_configs_updated ON pipeline_configs;
CREATE TRIGGER trigger_pipeline_configs_updated
    BEFORE UPDATE ON pipeline_configs
    FOR EACH ROW
    EXECUTE FUNCTION update_updated_timestamp();

-- ========================================
-- 7. Constraints & Validation
-- ========================================

-- Only one default config allowed (enforced via partial unique index)
CREATE UNIQUE INDEX IF NOT EXISTS idx_pipeline_configs_single_default
    ON pipeline_configs(is_default)
    WHERE is_default = true;

-- ========================================
-- 8. Comments (Documentation)
-- ========================================
COMMENT ON TABLE pipeline_configs IS 'v9f Issue #28: Configurable pipeline analyses. Admins can create multiple pipeline configs with different modules and timeframes.';
COMMENT ON COLUMN pipeline_configs.modules IS 'JSONB: which modules are active (boolean flags)';
COMMENT ON COLUMN pipeline_configs.timeframes IS 'JSONB: timeframes per module, in days';
COMMENT ON COLUMN pipeline_configs.stage1_prompts IS 'Array of slug values for parallel Stage-1 prompts';
COMMENT ON COLUMN pipeline_configs.stage2_prompt IS 'Slug of the synthesis prompt (combines Stage-1 results)';
COMMENT ON COLUMN pipeline_configs.stage3_prompt IS 'Optional slug of a Stage-3 prompt (e.g., goal alignment)';
128  backend/migrations/020_unified_prompt_system.sql  Normal file
@ -0,0 +1,128 @@
-- Migration 020: Unified Prompt System (Issue #28)
-- Consolidate ai_prompts and pipeline_configs into a single system
-- Type: 'base' (reusable building blocks) or 'pipeline' (workflows)

-- Step 1: Add new columns to ai_prompts and make template nullable
ALTER TABLE ai_prompts
    ADD COLUMN IF NOT EXISTS type VARCHAR(20) DEFAULT 'pipeline',
    ADD COLUMN IF NOT EXISTS stages JSONB,
    ADD COLUMN IF NOT EXISTS output_format VARCHAR(10) DEFAULT 'text',
    ADD COLUMN IF NOT EXISTS output_schema JSONB;

-- Make template nullable (pipeline-type prompts use stages instead)
ALTER TABLE ai_prompts
    ALTER COLUMN template DROP NOT NULL;
-- Step 2: Migrate existing single prompts to 1-stage pipeline format
-- All existing prompts become single-stage pipelines with an inline source
UPDATE ai_prompts
SET
    type = 'pipeline',
    stages = jsonb_build_array(
        jsonb_build_object(
            'stage', 1,
            'prompts', jsonb_build_array(
                jsonb_build_object(
                    'source', 'inline',
                    'template', template,
                    'output_key', REPLACE(slug, 'pipeline_', ''),
                    'output_format', 'text'
                )
            )
        )
    ),
    output_format = 'text'
WHERE stages IS NULL;
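After this UPDATE, each migrated single prompt carries a one-stage structure. The result can be inspected per prompt; the shape shown in the trailing comment is illustrative (the actual template string comes from the existing row):

```sql
-- Inspect the generated stages structure for one prompt
SELECT slug, jsonb_pretty(stages)
FROM ai_prompts
WHERE slug = 'pipeline_body';
-- Shape (illustrative):
-- [{"stage": 1,
--   "prompts": [{"source": "inline",
--                "template": "...",
--                "output_key": "body",
--                "output_format": "text"}]}]
```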

-- Step 3: Migrate pipeline_configs into ai_prompts as multi-stage pipelines
-- Each pipeline_config becomes a pipeline-type prompt with multiple stages
INSERT INTO ai_prompts (
    slug,
    name,
    description,
    type,
    stages,
    output_format,
    active,
    is_system_default,
    category
)
SELECT
    'pipeline_config_' || LOWER(REPLACE(pc.name, ' ', '_')) || '_' || SUBSTRING(pc.id::TEXT FROM 1 FOR 8) AS slug,
    pc.name,
    pc.description,
    'pipeline' AS type,
    -- Build stages JSONB: combine stage1_prompts, stage2_prompt, stage3_prompt
    (
        SELECT jsonb_agg(stage_obj ORDER BY stage_num)
        FROM (
            -- Stage 1: convert the slug array into reference prompts
            SELECT 1 AS stage_num,
                   jsonb_build_object(
                       'stage', 1,
                       'prompts', (
                           SELECT jsonb_agg(
                               jsonb_build_object(
                                   'source', 'reference',
                                   'slug', s1.slug_val,
                                   'output_key', REPLACE(s1.slug_val, 'pipeline_', 'stage1_'),
                                   'output_format', 'json'
                               )
                           )
                           FROM UNNEST(pc.stage1_prompts) AS s1(slug_val)
                       )
                   ) AS stage_obj
            WHERE array_length(pc.stage1_prompts, 1) > 0

            UNION ALL

            SELECT 2 AS stage_num,
                   jsonb_build_object(
                       'stage', 2,
                       'prompts', jsonb_build_array(
                           jsonb_build_object(
                               'source', 'reference',
                               'slug', pc.stage2_prompt,
                               'output_key', 'synthesis',
                               'output_format', 'text'
                           )
                       )
                   ) AS stage_obj
            WHERE pc.stage2_prompt IS NOT NULL

            UNION ALL

            SELECT 3 AS stage_num,
                   jsonb_build_object(
                       'stage', 3,
                       'prompts', jsonb_build_array(
                           jsonb_build_object(
                               'source', 'reference',
                               'slug', pc.stage3_prompt,
                               'output_key', 'goals',
                               'output_format', 'text'
                           )
                       )
                   ) AS stage_obj
            WHERE pc.stage3_prompt IS NOT NULL
        ) all_stages
    ) AS stages,
    'text' AS output_format,
    pc.active,
    pc.is_default AS is_system_default,
    'pipeline' AS category
FROM pipeline_configs pc;

-- Step 4: Add indices for performance
CREATE INDEX IF NOT EXISTS idx_ai_prompts_type ON ai_prompts(type);
CREATE INDEX IF NOT EXISTS idx_ai_prompts_stages ON ai_prompts USING GIN (stages);

-- Step 5: Add comment explaining the stages structure
COMMENT ON COLUMN ai_prompts.stages IS 'JSONB array of stages, each with prompts array. Structure: [{"stage":1,"prompts":[{"source":"reference|inline","slug":"...","template":"...","output_key":"key","output_format":"text|json"}]}]';

-- Step 6: Backup pipeline_configs before eventual deletion
CREATE TABLE IF NOT EXISTS pipeline_configs_backup_pre_020 AS
SELECT * FROM pipeline_configs;

-- Note: We keep the pipeline_configs table for now during the transition period.
-- It can be dropped in a later migration once all code is migrated.
7  backend/migrations/021_ai_insights_metadata.sql  Normal file
@ -0,0 +1,7 @@
-- Migration 021: Add metadata column to ai_insights for storing debug info
-- Date: 2026-03-26
-- Purpose: Store resolved placeholder values with descriptions for transparency

ALTER TABLE ai_insights ADD COLUMN IF NOT EXISTS metadata JSONB DEFAULT NULL;

COMMENT ON COLUMN ai_insights.metadata IS 'Debug info: resolved placeholders, descriptions, etc.';
135  backend/migrations/022_goal_system.sql  Normal file
@ -0,0 +1,135 @@
-- Migration 022: Goal System (Strategic + Tactical)
-- Date: 2026-03-26
-- Purpose: Two-level goal architecture for AI-driven coaching

-- ============================================================================
-- STRATEGIC LAYER: Goal Modes
-- ============================================================================

-- Add goal_mode to profiles (strategic training direction)
ALTER TABLE profiles ADD COLUMN IF NOT EXISTS goal_mode VARCHAR(50) DEFAULT 'health';

COMMENT ON COLUMN profiles.goal_mode IS
    'Strategic goal mode: weight_loss, strength, endurance, recomposition, health.
     Determines score weights and interpretation context for all analyses.';

-- ============================================================================
-- TACTICAL LAYER: Concrete Goal Targets
-- ============================================================================

CREATE TABLE IF NOT EXISTS goals (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    profile_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,

    -- Goal Classification
    goal_type VARCHAR(50) NOT NULL,        -- weight, body_fat, lean_mass, vo2max, strength, flexibility, bp, rhr
    is_primary BOOLEAN DEFAULT false,
    status VARCHAR(20) DEFAULT 'active',   -- draft, active, reached, abandoned, expired

    -- Target Values
    target_value DECIMAL(10,2),
    current_value DECIMAL(10,2),
    start_value DECIMAL(10,2),
    unit VARCHAR(20),                      -- kg, %, ml/kg/min, bpm, mmHg, cm, reps

    -- Timeline
    start_date DATE DEFAULT CURRENT_DATE,
    target_date DATE,
    reached_date DATE,

    -- Metadata
    name VARCHAR(100),                     -- e.g., "Sommerfigur 2026"
    description TEXT,

    -- Progress Tracking
    progress_pct DECIMAL(5,2),             -- Auto-calculated: (current - start) / (target - start) * 100
    projection_date DATE,                  -- Projected date the goal will be reached
    on_track BOOLEAN,                      -- true if projection_date <= target_date

    -- Timestamps
    created_at TIMESTAMP DEFAULT NOW(),
    updated_at TIMESTAMP DEFAULT NOW()
);

CREATE INDEX IF NOT EXISTS idx_goals_profile ON goals(profile_id);
CREATE INDEX IF NOT EXISTS idx_goals_status ON goals(profile_id, status);
CREATE INDEX IF NOT EXISTS idx_goals_primary ON goals(profile_id, is_primary) WHERE is_primary = true;

COMMENT ON TABLE goals IS 'Concrete user goals (tactical targets)';
COMMENT ON COLUMN goals.goal_type IS 'Type of goal: weight, body_fat, lean_mass, vo2max, strength, flexibility, bp, rhr';
COMMENT ON COLUMN goals.is_primary IS 'Primary goal gets highest priority in scoring and charts';
COMMENT ON COLUMN goals.status IS 'draft = not yet started, active = in progress, reached = successfully completed, abandoned = given up, expired = deadline passed';
COMMENT ON COLUMN goals.progress_pct IS 'Percentage progress: (current_value - start_value) / (target_value - start_value) * 100';
COMMENT ON COLUMN goals.projection_date IS 'Projected date when goal will be reached based on current trend';
COMMENT ON COLUMN goals.on_track IS 'true if projection_date <= target_date (goal reachable on time)';
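The progress_pct formula from the column comment translates directly into a set-based recalculation; a sketch with a divide-by-zero guard (this statement is not part of the migration itself):

```sql
-- Recompute progress for all goals with complete values
UPDATE goals
SET progress_pct = ROUND(
        (current_value - start_value)
        / NULLIF(target_value - start_value, 0) * 100, 2),
    updated_at = NOW()
WHERE current_value IS NOT NULL
  AND start_value  IS NOT NULL
  AND target_value IS NOT NULL;
```

`NULLIF` leaves progress_pct NULL when target and start coincide, rather than raising a division-by-zero error.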

-- ============================================================================
-- TRAINING PHASES (Auto-Detection)
-- ============================================================================

CREATE TABLE IF NOT EXISTS training_phases (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    profile_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,

    -- Phase Classification
    phase_type VARCHAR(50) NOT NULL,        -- calorie_deficit, calorie_surplus, deload, maintenance, periodization
    detected_automatically BOOLEAN DEFAULT false,
    confidence_score DECIMAL(3,2),          -- 0.00 - 1.00 (how confident the detection is)
    status VARCHAR(20) DEFAULT 'suggested', -- suggested, accepted, active, completed, rejected

    -- Timeframe
    start_date DATE NOT NULL,
    end_date DATE,
    duration_days INT,

    -- Detection Criteria (JSONB for flexibility)
    detection_params JSONB,                 -- { "avg_calories": 1800, "weight_trend": -0.3, ... }

    -- User Notes
    notes TEXT,

    -- Timestamps
    created_at TIMESTAMP DEFAULT NOW(),
    updated_at TIMESTAMP DEFAULT NOW()
);

CREATE INDEX IF NOT EXISTS idx_training_phases_profile ON training_phases(profile_id);
CREATE INDEX IF NOT EXISTS idx_training_phases_status ON training_phases(profile_id, status);
CREATE INDEX IF NOT EXISTS idx_training_phases_dates ON training_phases(profile_id, start_date, end_date);

COMMENT ON TABLE training_phases IS 'Training phases detected from data patterns or manually defined';
COMMENT ON COLUMN training_phases.phase_type IS 'calorie_deficit, calorie_surplus, deload, maintenance, periodization';
COMMENT ON COLUMN training_phases.detected_automatically IS 'true if AI detected this phase from data patterns';
COMMENT ON COLUMN training_phases.confidence_score IS 'AI confidence in detection (0.0 - 1.0)';
COMMENT ON COLUMN training_phases.status IS 'suggested = AI proposed, accepted = user confirmed, active = currently running, completed = finished, rejected = user dismissed';
COMMENT ON COLUMN training_phases.detection_params IS 'JSON with detection criteria: avg_calories, weight_trend, activity_volume, etc.';

-- ============================================================================
-- FITNESS TESTS (Standardized Performance Tests)
-- ============================================================================

CREATE TABLE IF NOT EXISTS fitness_tests (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    profile_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,

    -- Test Type
    test_type VARCHAR(50) NOT NULL,        -- cooper_12min, step_test, pushups_max, plank_max, flexibility_sit_reach, vo2max_est, strength_1rm_squat, strength_1rm_bench
    result_value DECIMAL(10,2) NOT NULL,
    result_unit VARCHAR(20) NOT NULL,      -- meters, bpm, reps, seconds, cm, ml/kg/min, kg

    -- Test Metadata
    test_date DATE NOT NULL,
    test_conditions TEXT,                  -- Optional: notes on test conditions
    norm_category VARCHAR(30),             -- sehr gut, gut, durchschnitt, unterdurchschnitt, schlecht

    -- Timestamps
    created_at TIMESTAMP DEFAULT NOW()
);

CREATE INDEX IF NOT EXISTS idx_fitness_tests_profile ON fitness_tests(profile_id);
CREATE INDEX IF NOT EXISTS idx_fitness_tests_type ON fitness_tests(profile_id, test_type);
CREATE INDEX IF NOT EXISTS idx_fitness_tests_date ON fitness_tests(profile_id, test_date);

COMMENT ON TABLE fitness_tests IS 'Standardized fitness tests (Cooper, step test, strength tests, etc.)';
COMMENT ON COLUMN fitness_tests.test_type IS 'cooper_12min, step_test, pushups_max, plank_max, flexibility_sit_reach, vo2max_est, strength_1rm_squat, strength_1rm_bench';
COMMENT ON COLUMN fitness_tests.norm_category IS 'Performance category based on age/gender norms';
185  backend/migrations/024_goal_type_registry.sql  Normal file
@ -0,0 +1,185 @@
-- Migration 024: Goal Type Registry (Flexible Goal System)
-- Date: 2026-03-27
-- Purpose: Enable dynamic goal types without code changes

-- ============================================================================
-- Goal Type Definitions
-- ============================================================================

CREATE TABLE IF NOT EXISTS goal_type_definitions (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),

    -- Unique identifier (used in code)
    type_key VARCHAR(50) UNIQUE NOT NULL,

    -- Display metadata
    label_de VARCHAR(100) NOT NULL,
    label_en VARCHAR(100),
    unit VARCHAR(20) NOT NULL,
    icon VARCHAR(10),
    category VARCHAR(50),                  -- body, mind, activity, nutrition, recovery, custom

    -- Data source configuration
    source_table VARCHAR(50),              -- Which table to query
    source_column VARCHAR(50),             -- Which column to fetch
    aggregation_method VARCHAR(20),        -- How to aggregate: latest, avg_7d, avg_30d, sum_30d, count_7d, count_30d, min_30d, max_30d

    -- Complex calculations (optional)
    -- For types like lean_mass that need custom logic
    -- JSON format: {"type": "formula", "dependencies": ["weight", "body_fat"], "expression": "..."}
    calculation_formula TEXT,

    -- Metadata
    description TEXT,
    is_active BOOLEAN DEFAULT true,
    is_system BOOLEAN DEFAULT false,       -- System types cannot be deleted

    -- Audit
    created_at TIMESTAMP DEFAULT NOW(),
    updated_at TIMESTAMP DEFAULT NOW()
);

CREATE INDEX IF NOT EXISTS idx_goal_type_definitions_active ON goal_type_definitions(is_active) WHERE is_active = true;
CREATE INDEX IF NOT EXISTS idx_goal_type_definitions_category ON goal_type_definitions(category);

COMMENT ON TABLE goal_type_definitions IS 'Registry of available goal types - allows dynamic goal creation without code changes';
COMMENT ON COLUMN goal_type_definitions.type_key IS 'Unique key used in code (e.g., weight, meditation_minutes)';
COMMENT ON COLUMN goal_type_definitions.aggregation_method IS 'latest = most recent value, avg_7d = 7-day average, count_7d = count in last 7 days, etc.';
COMMENT ON COLUMN goal_type_definitions.calculation_formula IS 'JSON for complex calculations like lean_mass = weight - (weight * bf_pct / 100)';
COMMENT ON COLUMN goal_type_definitions.is_system IS 'System types are protected from deletion (core functionality)';
|
||||||
|
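The source_table/source_column/aggregation_method triple above is meant to be resolved by application code, not by the database. A minimal sketch of how a backend might render such a registry row into a query — the helper name, the `profile_id`/`created_at` column names, the `%s` placeholder style, and the subset of aggregation windows shown are assumptions, not the project's actual code:

```python
# Hypothetical sketch: translate a goal_type_definitions row into SQL.
# Column names (profile_id, created_at) are assumptions about the log tables.
AGGREGATIONS = {
    "latest":   "SELECT {col} FROM {table} WHERE profile_id = %s ORDER BY created_at DESC LIMIT 1",
    "avg_7d":   "SELECT AVG({col}) FROM {table} WHERE profile_id = %s AND created_at >= NOW() - INTERVAL '7 days'",
    "avg_30d":  "SELECT AVG({col}) FROM {table} WHERE profile_id = %s AND created_at >= NOW() - INTERVAL '30 days'",
    "sum_30d":  "SELECT SUM({col}) FROM {table} WHERE profile_id = %s AND created_at >= NOW() - INTERVAL '30 days'",
    "count_7d": "SELECT COUNT({col}) FROM {table} WHERE profile_id = %s AND created_at >= NOW() - INTERVAL '7 days'",
    # ... remaining windows (count_30d, min_30d, max_30d) follow the same pattern
}

def build_value_query(source_table: str, source_column: str, aggregation_method: str) -> str:
    """Render the SQL that fetches a goal type's current value."""
    template = AGGREGATIONS[aggregation_method]
    return template.format(table=source_table, col=source_column)
```

For the seeded `weight` type this yields `build_value_query("weight_log", "weight", "latest")`, executed with the profile's id bound to the placeholder.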
-- ============================================================================
-- Seed Data: Migrate existing 8 goal types
-- ============================================================================

-- 1. Weight (simple - latest value)
INSERT INTO goal_type_definitions (
    type_key, label_de, label_en, unit, icon, category,
    source_table, source_column, aggregation_method,
    description, is_system
) VALUES (
    'weight', 'Gewicht', 'Weight', 'kg', '⚖️', 'body',
    'weight_log', 'weight', 'latest',
    'Aktuelles Körpergewicht', true
)
ON CONFLICT (type_key) DO NOTHING;

-- 2. Body Fat (simple - latest value)
INSERT INTO goal_type_definitions (
    type_key, label_de, label_en, unit, icon, category,
    source_table, source_column, aggregation_method,
    description, is_system
) VALUES (
    'body_fat', 'Körperfett', 'Body Fat', '%', '📊', 'body',
    'caliper_log', 'body_fat_pct', 'latest',
    'Körperfettanteil aus Caliper-Messung', true
)
ON CONFLICT (type_key) DO NOTHING;

-- 3. Lean Mass (complex - calculation formula)
INSERT INTO goal_type_definitions (
    type_key, label_de, label_en, unit, icon, category,
    calculation_formula,
    description, is_system
) VALUES (
    'lean_mass', 'Muskelmasse', 'Lean Mass', 'kg', '💪', 'body',
    '{"type": "lean_mass", "dependencies": ["weight_log.weight", "caliper_log.body_fat_pct"], "formula": "weight - (weight * body_fat_pct / 100)"}',
    'Fettfreie Körpermasse (berechnet aus Gewicht und Körperfett)', true
)
ON CONFLICT (type_key) DO NOTHING;
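The `calculation_formula` stored for `lean_mass` can be checked by hand. A small sketch (hypothetical evaluator — the real one lives in application code) that parses the stored JSON and applies the formula it declares:

```python
import json

def lean_mass(formula_json: str, weight: float, body_fat_pct: float) -> float:
    """Evaluate the lean_mass calculation_formula: weight - (weight * body_fat_pct / 100)."""
    spec = json.loads(formula_json)
    assert spec["type"] == "lean_mass"
    return weight - (weight * body_fat_pct / 100)

# The exact JSON seeded above:
FORMULA = ('{"type": "lean_mass", '
           '"dependencies": ["weight_log.weight", "caliper_log.body_fat_pct"], '
           '"formula": "weight - (weight * body_fat_pct / 100)"}')
```

E.g. 80 kg at 20% body fat gives a lean mass of 64 kg.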
-- 4. VO2 Max (simple - latest value)
INSERT INTO goal_type_definitions (
    type_key, label_de, label_en, unit, icon, category,
    source_table, source_column, aggregation_method,
    description, is_system
) VALUES (
    'vo2max', 'VO2Max', 'VO2Max', 'ml/kg/min', '🫁', 'recovery',
    'vitals_baseline', 'vo2_max', 'latest',
    'Maximale Sauerstoffaufnahme (geschätzt oder gemessen)', true
)
ON CONFLICT (type_key) DO NOTHING;

-- 5. Resting Heart Rate (simple - latest value)
INSERT INTO goal_type_definitions (
    type_key, label_de, label_en, unit, icon, category,
    source_table, source_column, aggregation_method,
    description, is_system
) VALUES (
    'rhr', 'Ruhepuls', 'Resting Heart Rate', 'bpm', '💓', 'recovery',
    'vitals_baseline', 'resting_hr', 'latest',
    'Ruhepuls morgens vor dem Aufstehen', true
)
ON CONFLICT (type_key) DO NOTHING;

-- 6. Blood Pressure (placeholder - compound goal for v2.0)
-- Currently limited to a single value; v2.0 will support systolic/diastolic
INSERT INTO goal_type_definitions (
    type_key, label_de, label_en, unit, icon, category,
    source_table, source_column, aggregation_method,
    description, is_system
) VALUES (
    'bp', 'Blutdruck', 'Blood Pressure', 'mmHg', '❤️', 'recovery',
    'blood_pressure_log', 'systolic', 'latest',
    'Blutdruck (aktuell nur systolisch, v2.0: beide Werte)', true
)
ON CONFLICT (type_key) DO NOTHING;

-- 7. Strength (placeholder - no data source yet)
INSERT INTO goal_type_definitions (
    type_key, label_de, label_en, unit, icon, category,
    description, is_system, is_active
) VALUES (
    'strength', 'Kraft', 'Strength', 'kg', '🏋️', 'activity',
    'Maximalkraft (Platzhalter, Datenquelle in v2.0)', true, false
)
ON CONFLICT (type_key) DO NOTHING;

-- 8. Flexibility (placeholder - no data source yet)
INSERT INTO goal_type_definitions (
    type_key, label_de, label_en, unit, icon, category,
    description, is_system, is_active
) VALUES (
    'flexibility', 'Beweglichkeit', 'Flexibility', 'cm', '🤸', 'activity',
    'Beweglichkeit (Platzhalter, Datenquelle in v2.0)', true, false
)
ON CONFLICT (type_key) DO NOTHING;

-- ============================================================================
-- Example: Future custom goal types (commented out, for reference)
-- ============================================================================

/*
-- Meditation Minutes (avg last 7 days)
INSERT INTO goal_type_definitions (
    type_key, label_de, unit, icon, category,
    source_table, source_column, aggregation_method,
    description, is_system
) VALUES (
    'meditation_minutes', 'Meditation', 'min/Tag', '🧘', 'mind',
    'meditation_log', 'duration_minutes', 'avg_7d',
    'Durchschnittliche Meditationsdauer pro Tag (7 Tage)', false
);

-- Training Frequency (count last 7 days)
INSERT INTO goal_type_definitions (
    type_key, label_de, unit, icon, category,
    source_table, source_column, aggregation_method,
    description, is_system
) VALUES (
    'training_frequency', 'Trainingshäufigkeit', 'x/Woche', '📅', 'activity',
    'activity_log', 'id', 'count_7d',
    'Anzahl Trainingseinheiten pro Woche', false
);

-- Sleep Quality (avg last 7 days)
INSERT INTO goal_type_definitions (
    type_key, label_de, unit, icon, category,
    source_table, source_column, aggregation_method,
    description, is_system
) VALUES (
    'sleep_quality', 'Schlafqualität', '%', '💤', 'recovery',
    'sleep_log', 'quality_score', 'avg_7d',
    'Durchschnittliche Schlafqualität (Deep+REM Anteil)', false
);
*/
backend/migrations/025_cleanup_goal_type_definitions.sql | 103 lines (new file)
@@ -0,0 +1,103 @@
-- Migration 025: Cleanup goal_type_definitions
-- Date: 2026-03-27
-- Purpose: Remove problematic FK columns and ensure seed data

-- Remove created_by/updated_by columns if they exist
-- (May have been created by a failed Migration 024)
ALTER TABLE goal_type_definitions DROP COLUMN IF EXISTS created_by;
ALTER TABLE goal_type_definitions DROP COLUMN IF EXISTS updated_by;

-- Re-insert seed data (ON CONFLICT ensures idempotency)
-- This fixes cases where Migration 024 created the table but failed to seed it

-- 1. Weight
INSERT INTO goal_type_definitions (
    type_key, label_de, label_en, unit, icon, category,
    source_table, source_column, aggregation_method,
    description, is_system
) VALUES (
    'weight', 'Gewicht', 'Weight', 'kg', '⚖️', 'body',
    'weight_log', 'weight', 'latest',
    'Aktuelles Körpergewicht', true
)
ON CONFLICT (type_key) DO NOTHING;

-- 2. Body Fat
INSERT INTO goal_type_definitions (
    type_key, label_de, label_en, unit, icon, category,
    source_table, source_column, aggregation_method,
    description, is_system
) VALUES (
    'body_fat', 'Körperfett', 'Body Fat', '%', '📊', 'body',
    'caliper_log', 'body_fat_pct', 'latest',
    'Körperfettanteil aus Caliper-Messung', true
)
ON CONFLICT (type_key) DO NOTHING;

-- 3. Lean Mass
INSERT INTO goal_type_definitions (
    type_key, label_de, label_en, unit, icon, category,
    calculation_formula,
    description, is_system
) VALUES (
    'lean_mass', 'Muskelmasse', 'Lean Mass', 'kg', '💪', 'body',
    '{"type": "lean_mass", "dependencies": ["weight_log.weight", "caliper_log.body_fat_pct"], "formula": "weight - (weight * body_fat_pct / 100)"}',
    'Fettfreie Körpermasse (berechnet aus Gewicht und Körperfett)', true
)
ON CONFLICT (type_key) DO NOTHING;

-- 4. VO2 Max
INSERT INTO goal_type_definitions (
    type_key, label_de, label_en, unit, icon, category,
    source_table, source_column, aggregation_method,
    description, is_system
) VALUES (
    'vo2max', 'VO2Max', 'VO2Max', 'ml/kg/min', '🫁', 'recovery',
    'vitals_baseline', 'vo2_max', 'latest',
    'Maximale Sauerstoffaufnahme (geschätzt oder gemessen)', true
)
ON CONFLICT (type_key) DO NOTHING;

-- 5. Resting Heart Rate
INSERT INTO goal_type_definitions (
    type_key, label_de, label_en, unit, icon, category,
    source_table, source_column, aggregation_method,
    description, is_system
) VALUES (
    'rhr', 'Ruhepuls', 'Resting Heart Rate', 'bpm', '💓', 'recovery',
    'vitals_baseline', 'resting_hr', 'latest',
    'Ruhepuls morgens vor dem Aufstehen', true
)
ON CONFLICT (type_key) DO NOTHING;

-- 6. Blood Pressure
INSERT INTO goal_type_definitions (
    type_key, label_de, label_en, unit, icon, category,
    source_table, source_column, aggregation_method,
    description, is_system
) VALUES (
    'bp', 'Blutdruck', 'Blood Pressure', 'mmHg', '❤️', 'recovery',
    'blood_pressure_log', 'systolic', 'latest',
    'Blutdruck (aktuell nur systolisch, v2.0: beide Werte)', true
)
ON CONFLICT (type_key) DO NOTHING;

-- 7. Strength (inactive placeholder)
INSERT INTO goal_type_definitions (
    type_key, label_de, label_en, unit, icon, category,
    description, is_system, is_active
) VALUES (
    'strength', 'Kraft', 'Strength', 'kg', '🏋️', 'activity',
    'Maximalkraft (Platzhalter, Datenquelle in v2.0)', true, false
)
ON CONFLICT (type_key) DO NOTHING;

-- 8. Flexibility (inactive placeholder)
INSERT INTO goal_type_definitions (
    type_key, label_de, label_en, unit, icon, category,
    description, is_system, is_active
) VALUES (
    'flexibility', 'Beweglichkeit', 'Flexibility', 'cm', '🤸', 'activity',
    'Beweglichkeit (Platzhalter, Datenquelle in v2.0)', true, false
)
ON CONFLICT (type_key) DO NOTHING;
backend/migrations/026_goal_type_filters.sql | 40 lines (new file)
@@ -0,0 +1,40 @@
-- Migration 026: Goal Type Filters
-- Date: 2026-03-27
-- Purpose: Enable filtered counting/aggregation (e.g., count only strength training)

-- Add filter_conditions column for flexible filtering
ALTER TABLE goal_type_definitions
    ADD COLUMN IF NOT EXISTS filter_conditions JSONB;

COMMENT ON COLUMN goal_type_definitions.filter_conditions IS
'Optional filter conditions as JSON. Example: {"training_type": "strength"} to count only strength training sessions.
Supports any column in the source table. Format: {"column_name": "value"} or {"column_name": ["value1", "value2"]} for IN clause.';
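A sketch of how the `filter_conditions` format described in the comment could be turned into a WHERE fragment with bound parameters. This is a hypothetical helper, not the project's code; note that column names are interpolated into the SQL, so they must come from the trusted registry, never from user input:

```python
import json

def filter_clause(filter_conditions: str):
    """Build a WHERE fragment + parameter list from a filter_conditions JSON string.
    Scalar values become `col = %s`, lists become `col IN (%s, ...)`."""
    conditions = json.loads(filter_conditions)
    parts, params = [], []
    for column, value in conditions.items():
        if isinstance(value, list):
            placeholders = ", ".join(["%s"] * len(value))
            parts.append(f"{column} IN ({placeholders})")
            params.extend(value)
        else:
            parts.append(f"{column} = %s")
            params.append(value)
    return " AND ".join(parts), params
```

For the `strength_frequency` example below, `filter_clause('{"training_type": "strength"}')` yields the fragment `training_type = %s` with `["strength"]` bound.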
-- Example usage (commented out):
/*
-- Count only strength training sessions per week
INSERT INTO goal_type_definitions (
    type_key, label_de, unit, icon, category,
    source_table, source_column, aggregation_method,
    filter_conditions,
    description, is_system
) VALUES (
    'strength_frequency', 'Krafttraining Häufigkeit', 'x/Woche', '🏋️', 'activity',
    'activity_log', 'id', 'count_7d',
    '{"training_type": "strength"}',
    'Anzahl Krafttraining-Einheiten pro Woche', false
) ON CONFLICT (type_key) DO NOTHING;

-- Count only cardio sessions per week
INSERT INTO goal_type_definitions (
    type_key, label_de, unit, icon, category,
    source_table, source_column, aggregation_method,
    filter_conditions,
    description, is_system
) VALUES (
    'cardio_frequency', 'Cardio Häufigkeit', 'x/Woche', '🏃', 'activity',
    'activity_log', 'id', 'count_7d',
    '{"training_type": "cardio"}',
    'Anzahl Cardio-Einheiten pro Woche', false
) ON CONFLICT (type_key) DO NOTHING;
*/
backend/migrations/027_focus_areas_system.sql | 125 lines (new file)
@@ -0,0 +1,125 @@
-- Migration 027: Focus Areas System (Goal System v2.0)
-- Date: 2026-03-27
-- Purpose: Replace single primary goal with weighted multi-goal system

-- ============================================================================
-- Focus Areas Table
-- ============================================================================

CREATE TABLE IF NOT EXISTS focus_areas (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    profile_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,

    -- Six focus dimensions (percentages, sum = 100)
    weight_loss_pct INTEGER DEFAULT 0 CHECK (weight_loss_pct >= 0 AND weight_loss_pct <= 100),
    muscle_gain_pct INTEGER DEFAULT 0 CHECK (muscle_gain_pct >= 0 AND muscle_gain_pct <= 100),
    strength_pct INTEGER DEFAULT 0 CHECK (strength_pct >= 0 AND strength_pct <= 100),
    endurance_pct INTEGER DEFAULT 0 CHECK (endurance_pct >= 0 AND endurance_pct <= 100),
    flexibility_pct INTEGER DEFAULT 0 CHECK (flexibility_pct >= 0 AND flexibility_pct <= 100),
    health_pct INTEGER DEFAULT 0 CHECK (health_pct >= 0 AND health_pct <= 100),

    -- Status
    active BOOLEAN DEFAULT true,

    -- Audit
    created_at TIMESTAMP DEFAULT NOW(),
    updated_at TIMESTAMP DEFAULT NOW(),

    -- Constraints
    CONSTRAINT sum_equals_100 CHECK (
        weight_loss_pct + muscle_gain_pct + strength_pct +
        endurance_pct + flexibility_pct + health_pct = 100
    )
);

-- Only one active focus_areas per profile
CREATE UNIQUE INDEX IF NOT EXISTS idx_focus_areas_profile_active
    ON focus_areas(profile_id) WHERE active = true;

COMMENT ON TABLE focus_areas IS 'User-defined focus area weights (replaces simple goal_mode). Enables multi-goal prioritization with custom percentages.';
COMMENT ON COLUMN focus_areas.weight_loss_pct IS 'Focus on fat loss (0-100%)';
COMMENT ON COLUMN focus_areas.muscle_gain_pct IS 'Focus on muscle growth (0-100%)';
COMMENT ON COLUMN focus_areas.strength_pct IS 'Focus on strength gains (0-100%)';
COMMENT ON COLUMN focus_areas.endurance_pct IS 'Focus on aerobic capacity (0-100%)';
COMMENT ON COLUMN focus_areas.flexibility_pct IS 'Focus on mobility/flexibility (0-100%)';
COMMENT ON COLUMN focus_areas.health_pct IS 'Focus on general health (0-100%)';
-- ============================================================================
-- Migrate existing goal_mode to focus_areas
-- ============================================================================

-- For each profile with a goal_mode, create initial focus_areas
INSERT INTO focus_areas (
    profile_id,
    weight_loss_pct, muscle_gain_pct, strength_pct,
    endurance_pct, flexibility_pct, health_pct
)
SELECT
    id AS profile_id,
    CASE goal_mode
        WHEN 'weight_loss' THEN 60
        WHEN 'recomposition' THEN 30
        WHEN 'health' THEN 5
        ELSE 0
    END AS weight_loss_pct,

    CASE goal_mode
        WHEN 'strength' THEN 40 ELSE 0
    END +
    CASE goal_mode
        WHEN 'recomposition' THEN 30 ELSE 0
    END AS muscle_gain_pct,

    CASE goal_mode
        WHEN 'strength' THEN 50
        WHEN 'recomposition' THEN 25
        WHEN 'weight_loss' THEN 10
        WHEN 'health' THEN 10
        ELSE 0
    END AS strength_pct,

    CASE goal_mode
        WHEN 'endurance' THEN 70
        WHEN 'recomposition' THEN 10
        WHEN 'weight_loss' THEN 20
        WHEN 'health' THEN 20
        ELSE 0
    END AS endurance_pct,

    CASE goal_mode
        WHEN 'endurance' THEN 10 ELSE 0
    END +
    CASE goal_mode
        WHEN 'health' THEN 15 ELSE 0
    END +
    CASE goal_mode
        WHEN 'recomposition' THEN 5 ELSE 0
    END +
    CASE goal_mode
        WHEN 'weight_loss' THEN 5 ELSE 0
    END AS flexibility_pct,

    CASE goal_mode
        WHEN 'health' THEN 50
        WHEN 'endurance' THEN 20
        WHEN 'strength' THEN 10
        WHEN 'weight_loss' THEN 5
        ELSE 0
    END AS health_pct
FROM profiles
WHERE goal_mode IS NOT NULL
ON CONFLICT DO NOTHING;

-- For profiles without goal_mode, use balanced health focus
INSERT INTO focus_areas (
    profile_id,
    weight_loss_pct, muscle_gain_pct, strength_pct,
    endurance_pct, flexibility_pct, health_pct
)
SELECT
    id AS profile_id,
    0, 0, 10, 20, 15, 55
FROM profiles
WHERE goal_mode IS NULL
    AND id NOT IN (SELECT profile_id FROM focus_areas WHERE active = true)
ON CONFLICT DO NOTHING;
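The CASE expressions in Migration 027 assign one weight vector per legacy goal_mode. Restated here as a plain table for checking — every vector, including the balanced fallback used for profiles without a goal_mode, must satisfy the `sum_equals_100` CHECK constraint:

```python
# Weight vectors produced by Migration 027, in column order:
# (weight_loss, muscle_gain, strength, endurance, flexibility, health)
GOAL_MODE_WEIGHTS = {
    "weight_loss":   (60, 0, 10, 20, 5, 5),
    "recomposition": (30, 30, 25, 10, 5, 0),
    "strength":      (0, 40, 50, 0, 0, 10),
    "endurance":     (0, 0, 0, 70, 10, 20),
    "health":        (5, 0, 10, 20, 15, 50),
    None:            (0, 0, 10, 20, 15, 55),  # fallback: profiles without goal_mode
}

# Each vector must pass the sum_equals_100 CHECK constraint:
assert all(sum(weights) == 100 for weights in GOAL_MODE_WEIGHTS.values())
```

Since every row sums to exactly 100, the migration cannot be rejected by the constraint regardless of which goal_mode a profile had.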
backend/migrations/028_goal_categories_priorities.sql | 57 lines (new file)
@@ -0,0 +1,57 @@
-- Migration 028: Goal Categories and Priorities
-- Date: 2026-03-27
-- Purpose: Multi-dimensional goal priorities (one primary goal per category)

-- ============================================================================
-- Add category and priority columns
-- ============================================================================

ALTER TABLE goals
    ADD COLUMN category VARCHAR(50),
    ADD COLUMN priority INTEGER DEFAULT 2 CHECK (priority >= 1 AND priority <= 3);

COMMENT ON COLUMN goals.category IS 'Goal category: body, training, nutrition, recovery, health, other';
COMMENT ON COLUMN goals.priority IS 'Priority level: 1=high, 2=medium, 3=low';

-- ============================================================================
-- Migrate existing goals to categories based on goal_type
-- ============================================================================

UPDATE goals SET category = CASE
    -- Body composition goals
    WHEN goal_type IN ('weight', 'body_fat', 'lean_mass') THEN 'body'

    -- Training goals
    WHEN goal_type IN ('strength', 'flexibility', 'training_frequency') THEN 'training'

    -- Health/cardio goals
    WHEN goal_type IN ('vo2max', 'rhr', 'bp', 'hrv') THEN 'health'

    -- Recovery goals
    WHEN goal_type IN ('sleep_quality', 'sleep_duration', 'rest_days') THEN 'recovery'

    -- Nutrition goals
    WHEN goal_type IN ('calories', 'protein', 'healthy_eating') THEN 'nutrition'

    -- Default
    ELSE 'other'
END
WHERE category IS NULL;

-- ============================================================================
-- Set priority based on is_primary
-- ============================================================================

UPDATE goals SET priority = CASE
    WHEN is_primary = true THEN 1 -- Primary goals get priority 1
    ELSE 2                        -- Others get priority 2 (medium)
END;

-- ============================================================================
-- Create index for category-based queries
-- ============================================================================

CREATE INDEX IF NOT EXISTS idx_goals_category_priority
    ON goals(profile_id, category, priority);

COMMENT ON INDEX idx_goals_category_priority IS 'Fast lookup for category-grouped goals sorted by priority';
backend/migrations/029_fix_missing_goal_types.sql | 74 lines (new file)
@@ -0,0 +1,74 @@
-- Migration 029: Fix Missing Goal Types (flexibility, strength)
-- Date: 2026-03-27
-- Purpose: Ensure flexibility and strength goal types are active and properly configured

-- These types were created earlier but are inactive or misconfigured
-- This migration fixes them without breaking if they don't exist

-- ============================================================================
-- Upsert flexibility goal type
-- ============================================================================

INSERT INTO goal_type_definitions (
    type_key, label_de, label_en, unit, icon, category,
    source_table, source_column, aggregation_method,
    calculation_formula, filter_conditions, description, is_active
) VALUES (
    'flexibility',
    'Beweglichkeit',
    'Flexibility',
    'cm',
    '🤸',
    'training',
    NULL, -- No automatic data source
    NULL,
    'latest',
    NULL,
    NULL,
    'Beweglichkeit und Mobilität - manuelle Erfassung',
    true
)
ON CONFLICT (type_key)
DO UPDATE SET
    label_de = 'Beweglichkeit',
    label_en = 'Flexibility',
    unit = 'cm',
    icon = '🤸',
    category = 'training',
    is_active = true,
    description = 'Beweglichkeit und Mobilität - manuelle Erfassung';

-- ============================================================================
-- Upsert strength goal type
-- ============================================================================

INSERT INTO goal_type_definitions (
    type_key, label_de, label_en, unit, icon, category,
    source_table, source_column, aggregation_method,
    calculation_formula, filter_conditions, description, is_active
) VALUES (
    'strength',
    'Kraftniveau',
    'Strength',
    'Punkte',
    '💪',
    'training',
    NULL, -- No automatic data source
    NULL,
    'latest',
    NULL,
    NULL,
    'Allgemeines Kraftniveau - manuelle Erfassung',
    true
)
ON CONFLICT (type_key)
DO UPDATE SET
    label_de = 'Kraftniveau',
    label_en = 'Strength',
    unit = 'Punkte',
    icon = '💪',
    category = 'training',
    is_active = true,
    description = 'Allgemeines Kraftniveau - manuelle Erfassung';

COMMENT ON TABLE goal_type_definitions IS 'Goal type registry - defines all available goal types (v1.5: DB-driven, flexible system)';
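Unlike the `DO NOTHING` seeds in Migrations 024/025, Migration 029 uses `ON CONFLICT ... DO UPDATE`, so an existing but inactive row is reactivated rather than left untouched, and re-running the migration is harmless. A sketch of that upsert semantics on a plain dict (illustrative only, not the migration runner):

```python
def upsert(registry: dict, type_key: str, **fields) -> dict:
    """INSERT ... ON CONFLICT (type_key) DO UPDATE, simulated on a dict keyed by type_key."""
    row = registry.setdefault(type_key, {"type_key": type_key})  # INSERT if missing
    row.update(fields)                                           # DO UPDATE SET on conflict
    return registry

# An inactive 'flexibility' row left behind by Migration 024:
registry = {"flexibility": {"type_key": "flexibility", "is_active": False}}
upsert(registry, "flexibility", is_active=True, category="training")
upsert(registry, "flexibility", is_active=True, category="training")  # re-run: idempotent
```

After the upsert the row is active and recategorized, and no duplicate was created.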
backend/migrations/030_goal_progress_log.sql | 64 lines (new file)
@@ -0,0 +1,64 @@
-- Migration 030: Goal Progress Log
-- Date: 2026-03-27
-- Purpose: Track progress history for all goals (especially custom goals without a data source)

-- ============================================================================
-- Goal Progress Log Table
-- ============================================================================

CREATE TABLE IF NOT EXISTS goal_progress_log (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    goal_id UUID NOT NULL REFERENCES goals(id) ON DELETE CASCADE,
    profile_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,

    -- Progress data
    date DATE NOT NULL,
    value DECIMAL(10,2) NOT NULL,
    note TEXT,

    -- Metadata
    source VARCHAR(20) DEFAULT 'manual' CHECK (source IN ('manual', 'automatic', 'import')),
    created_at TIMESTAMP DEFAULT NOW(),
    updated_at TIMESTAMP DEFAULT NOW(),

    -- Constraints
    CONSTRAINT unique_progress_per_day UNIQUE(goal_id, date)
);

CREATE INDEX idx_goal_progress_goal_date ON goal_progress_log(goal_id, date DESC);
CREATE INDEX idx_goal_progress_profile ON goal_progress_log(profile_id);

COMMENT ON TABLE goal_progress_log IS 'Progress history for goals - enables manual tracking for custom goals and charts';
COMMENT ON COLUMN goal_progress_log.value IS 'Progress value in goal unit (e.g., kg, cm, points)';
COMMENT ON COLUMN goal_progress_log.source IS 'manual: user entered, automatic: computed from data source, import: CSV/API';

-- ============================================================================
-- Function: Update goal current_value from latest progress
-- ============================================================================

CREATE OR REPLACE FUNCTION update_goal_current_value()
RETURNS TRIGGER AS $$
BEGIN
    -- Update current_value in goals table with the latest progress entry
    UPDATE goals
    SET current_value = (
        SELECT value
        FROM goal_progress_log
        WHERE goal_id = NEW.goal_id
        ORDER BY date DESC
        LIMIT 1
    ),
    updated_at = NOW()
    WHERE id = NEW.goal_id;

    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

-- Trigger: Auto-update current_value when progress is added/updated
CREATE TRIGGER trigger_update_goal_current_value
    AFTER INSERT OR UPDATE ON goal_progress_log
    FOR EACH ROW
    EXECUTE FUNCTION update_goal_current_value();

COMMENT ON FUNCTION update_goal_current_value IS 'Auto-update goal.current_value when new progress is logged';
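The trigger above copies the value of the most recent entry *by date* into `goals.current_value`, regardless of insertion order (a backdated entry does not overwrite a newer one). The equivalent selection logic, sketched outside the database:

```python
from datetime import date

def current_value(progress_log):
    """Mirror of the trigger's subquery: value of the entry with the latest date.
    progress_log is a list of (date, value) tuples for one goal."""
    if not progress_log:
        return None
    return max(progress_log, key=lambda entry: entry[0])[1]

# Entries inserted out of order: the backdated 82.5 does not win.
log = [(date(2026, 3, 27), 81.0), (date(2026, 3, 25), 82.5)]
```

Here `current_value(log)` is 81.0, the value dated 2026-03-27, even though 82.5 was appended later.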
backend/migrations/031_focus_area_system_v2.sql | 254 lines (new file)
@@ -0,0 +1,254 @@
|
||||||
|
-- Migration 031: Focus Area System v2.0
|
||||||
|
-- Date: 2026-03-27
|
||||||
|
-- Purpose: Dynamic, extensible focus areas with Many-to-Many goal contributions
|
||||||
|
|
||||||
|
-- ============================================================================
|
||||||
|
-- Part 1: New Tables
|
||||||
|
-- ============================================================================
|
||||||
|
|
||||||
|
-- Focus Area Definitions (dynamic, user-extensible)
|
||||||
|
CREATE TABLE IF NOT EXISTS focus_area_definitions (
|
||||||
|
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
|
||||||
|
key VARCHAR(50) UNIQUE NOT NULL, -- e.g. 'strength', 'aerobic_endurance'
|
||||||
|
name_de VARCHAR(100) NOT NULL,
|
||||||
|
name_en VARCHAR(100),
|
||||||
|
icon VARCHAR(10),
|
||||||
|
description TEXT,
|
||||||
|
category VARCHAR(50), -- 'body_composition', 'training', 'endurance', 'coordination', 'mental', 'recovery', 'health'
|
||||||
|
is_active BOOLEAN DEFAULT true,
|
||||||
|
created_at TIMESTAMP DEFAULT NOW(),
|
||||||
|
updated_at TIMESTAMP DEFAULT NOW()
|
||||||
|
);
|
||||||
|
|
||||||
|
CREATE INDEX idx_focus_area_key ON focus_area_definitions(key);
|
||||||
|
CREATE INDEX idx_focus_area_category ON focus_area_definitions(category);
|
||||||
|
|
||||||
|
COMMENT ON TABLE focus_area_definitions IS 'Dynamic focus area registry - defines all available focus dimensions';
|
||||||
|
COMMENT ON COLUMN focus_area_definitions.key IS 'Unique identifier for programmatic access';
|
||||||
|
COMMENT ON COLUMN focus_area_definitions.category IS 'Grouping for UI display';
|
||||||
|
|
||||||
|
-- Many-to-many: goals contribute to focus areas
CREATE TABLE IF NOT EXISTS goal_focus_contributions (
    goal_id UUID NOT NULL REFERENCES goals(id) ON DELETE CASCADE,
    focus_area_id UUID NOT NULL REFERENCES focus_area_definitions(id) ON DELETE CASCADE,
    contribution_weight DECIMAL(5,2) DEFAULT 100.00 CHECK (contribution_weight >= 0 AND contribution_weight <= 100),
    created_at TIMESTAMP DEFAULT NOW(),
    PRIMARY KEY (goal_id, focus_area_id)
);

CREATE INDEX IF NOT EXISTS idx_gfc_goal ON goal_focus_contributions(goal_id);
CREATE INDEX IF NOT EXISTS idx_gfc_focus_area ON goal_focus_contributions(focus_area_id);

COMMENT ON TABLE goal_focus_contributions IS 'Maps goals to focus areas with contribution weights (0-100%)';
COMMENT ON COLUMN goal_focus_contributions.contribution_weight IS 'How much this goal contributes to the focus area (0-100%)';
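As a consumption sketch (not part of the migration): summing the weighted contributions per focus area yields a per-user focus profile. The `profile_id` column on `goals` is an assumption here, since the `goals` schema is not shown in this migration.

```sql
-- Sketch: aggregate weighted goal contributions per focus area for one user.
-- Assumes goals has a profile_id column (not shown in this migration);
-- :profile_id is a psql-style placeholder.
SELECT fad.key,
       SUM(gfc.contribution_weight) AS total_weight
FROM goal_focus_contributions gfc
JOIN focus_area_definitions fad ON fad.id = gfc.focus_area_id
JOIN goals g                    ON g.id  = gfc.goal_id
WHERE g.profile_id = :profile_id
GROUP BY fad.key
ORDER BY total_weight DESC;
```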

-- ============================================================================
-- Part 2: Rename existing focus_areas table
-- ============================================================================

-- Old focus_areas table becomes user_focus_preferences
ALTER TABLE focus_areas RENAME TO user_focus_preferences;

-- Add a free-text notes column (references into focus_area_definitions are deferred to a later migration)
ALTER TABLE user_focus_preferences ADD COLUMN IF NOT EXISTS notes TEXT;

COMMENT ON TABLE user_focus_preferences IS 'User-specific focus area weightings (legacy flat structure + new references)';

-- ============================================================================
-- Part 3: Seed Data - Base Focus Areas
-- ============================================================================

INSERT INTO focus_area_definitions (key, name_de, name_en, icon, category, description) VALUES
-- Body composition
('weight_loss', 'Gewichtsverlust', 'Weight Loss', '📉', 'body_composition', 'Körpergewicht reduzieren'),
('muscle_gain', 'Muskelaufbau', 'Muscle Gain', '💪', 'body_composition', 'Muskelmasse aufbauen'),
('body_recomposition', 'Body Recomposition', 'Body Recomposition', '⚖️', 'body_composition', 'Gleichzeitig Fett abbauen und Muskeln aufbauen'),

-- Training: strength
('strength', 'Maximalkraft', 'Strength', '🏋️', 'training', 'Maximale Kraftfähigkeit'),
('strength_endurance', 'Kraftausdauer', 'Strength Endurance', '💪🏃', 'training', 'Kraft über längere Zeit aufrechterhalten'),
('power', 'Schnellkraft', 'Power', '⚡', 'training', 'Kraft in kurzer Zeit entfalten'),

-- Training: flexibility
('flexibility', 'Beweglichkeit', 'Flexibility', '🤸', 'training', 'Gelenkigkeit und Bewegungsumfang'),
('mobility', 'Mobilität', 'Mobility', '🦴', 'training', 'Aktive Beweglichkeit und Kontrolle'),

-- Endurance
('aerobic_endurance', 'Aerobe Ausdauer', 'Aerobic Endurance', '🫁', 'endurance', 'VO2Max, lange moderate Belastung'),
('anaerobic_endurance', 'Anaerobe Ausdauer', 'Anaerobic Endurance', '⚡', 'endurance', 'Laktattoleranz, kurze intensive Belastung'),
('cardiovascular_health', 'Herz-Kreislauf', 'Cardiovascular Health', '❤️', 'endurance', 'Herzgesundheit und Ausdauer'),

-- Coordination
('balance', 'Gleichgewicht', 'Balance', '⚖️', 'coordination', 'Statisches und dynamisches Gleichgewicht'),
('reaction', 'Reaktionsfähigkeit', 'Reaction', '⚡', 'coordination', 'Schnelligkeit der Reaktion auf Reize'),
('rhythm', 'Rhythmusgefühl', 'Rhythm', '🎵', 'coordination', 'Zeitliche Abstimmung von Bewegungen'),
('coordination', 'Koordination', 'Coordination', '🎯', 'coordination', 'Zusammenspiel verschiedener Bewegungen'),

-- Mental
('stress_resistance', 'Stressresistenz', 'Stress Resistance', '🧘', 'mental', 'Umgang mit mentalem und physischem Stress'),
('concentration', 'Konzentration', 'Concentration', '🎯', 'mental', 'Fokussierung und Aufmerksamkeit'),
('willpower', 'Willenskraft', 'Willpower', '💎', 'mental', 'Durchhaltevermögen und Selbstdisziplin'),
('mental_health', 'Mentale Gesundheit', 'Mental Health', '🧠', 'mental', 'Psychisches Wohlbefinden'),

-- Recovery
('sleep_quality', 'Schlafqualität', 'Sleep Quality', '😴', 'recovery', 'Erholsamer Schlaf'),
('regeneration', 'Regeneration', 'Regeneration', '♻️', 'recovery', 'Körperliche Erholung'),
('rest', 'Ruhe', 'Rest', '🛌', 'recovery', 'Aktive und passive Erholung'),

-- Health
('metabolic_health', 'Stoffwechselgesundheit', 'Metabolic Health', '🔥', 'health', 'Blutzucker, Insulin, Stoffwechsel'),
('blood_pressure', 'Blutdruck', 'Blood Pressure', '❤️🩹', 'health', 'Gesunder Blutdruck'),
('hrv', 'Herzratenvariabilität', 'HRV', '💓', 'health', 'Autonomes Nervensystem'),
('general_health', 'Allgemeine Gesundheit', 'General Health', '🏥', 'health', 'Vitale Gesundheit und Wohlbefinden')
ON CONFLICT (key) DO NOTHING;

-- ============================================================================
-- Part 4: Auto-Mapping - map existing goals to focus areas
-- ============================================================================

-- Helper function to get focus_area_id by key
CREATE OR REPLACE FUNCTION get_focus_area_id(area_key VARCHAR)
RETURNS UUID AS $$
BEGIN
    RETURN (SELECT id FROM focus_area_definitions WHERE key = area_key LIMIT 1);
END;
$$ LANGUAGE plpgsql;

-- Weight goals → weight_loss (100%)
INSERT INTO goal_focus_contributions (goal_id, focus_area_id, contribution_weight)
SELECT g.id, get_focus_area_id('weight_loss'), 100.00
FROM goals g
WHERE g.goal_type = 'weight'
ON CONFLICT (goal_id, focus_area_id) DO NOTHING;

-- Body fat goals → weight_loss (60%) + body_recomposition (40%)
INSERT INTO goal_focus_contributions (goal_id, focus_area_id, contribution_weight)
SELECT g.id, fa.id,
    CASE fa.key
        WHEN 'weight_loss' THEN 60.00
        WHEN 'body_recomposition' THEN 40.00
    END
FROM goals g
CROSS JOIN focus_area_definitions fa
WHERE g.goal_type = 'body_fat'
AND fa.key IN ('weight_loss', 'body_recomposition')
ON CONFLICT (goal_id, focus_area_id) DO NOTHING;

-- Lean mass goals → muscle_gain (70%) + body_recomposition (30%)
INSERT INTO goal_focus_contributions (goal_id, focus_area_id, contribution_weight)
SELECT g.id, fa.id,
    CASE fa.key
        WHEN 'muscle_gain' THEN 70.00
        WHEN 'body_recomposition' THEN 30.00
    END
FROM goals g
CROSS JOIN focus_area_definitions fa
WHERE g.goal_type = 'lean_mass'
AND fa.key IN ('muscle_gain', 'body_recomposition')
ON CONFLICT (goal_id, focus_area_id) DO NOTHING;

-- Strength goals → strength (70%) + muscle_gain (30%)
INSERT INTO goal_focus_contributions (goal_id, focus_area_id, contribution_weight)
SELECT g.id, fa.id,
    CASE fa.key
        WHEN 'strength' THEN 70.00
        WHEN 'muscle_gain' THEN 30.00
    END
FROM goals g
CROSS JOIN focus_area_definitions fa
WHERE g.goal_type = 'strength'
AND fa.key IN ('strength', 'muscle_gain')
ON CONFLICT (goal_id, focus_area_id) DO NOTHING;

-- Flexibility goals → flexibility (100%)
INSERT INTO goal_focus_contributions (goal_id, focus_area_id, contribution_weight)
SELECT g.id, get_focus_area_id('flexibility'), 100.00
FROM goals g
WHERE g.goal_type = 'flexibility'
ON CONFLICT (goal_id, focus_area_id) DO NOTHING;

-- VO2Max goals → aerobic_endurance (80%) + cardiovascular_health (20%)
INSERT INTO goal_focus_contributions (goal_id, focus_area_id, contribution_weight)
SELECT g.id, fa.id,
    CASE fa.key
        WHEN 'aerobic_endurance' THEN 80.00
        WHEN 'cardiovascular_health' THEN 20.00
    END
FROM goals g
CROSS JOIN focus_area_definitions fa
WHERE g.goal_type = 'vo2max'
AND fa.key IN ('aerobic_endurance', 'cardiovascular_health')
ON CONFLICT (goal_id, focus_area_id) DO NOTHING;

-- Resting heart rate goals → cardiovascular_health (100%)
INSERT INTO goal_focus_contributions (goal_id, focus_area_id, contribution_weight)
SELECT g.id, get_focus_area_id('cardiovascular_health'), 100.00
FROM goals g
WHERE g.goal_type = 'rhr'
ON CONFLICT (goal_id, focus_area_id) DO NOTHING;

-- Blood pressure goals → blood_pressure (80%) + cardiovascular_health (20%)
INSERT INTO goal_focus_contributions (goal_id, focus_area_id, contribution_weight)
SELECT g.id, fa.id,
    CASE fa.key
        WHEN 'blood_pressure' THEN 80.00
        WHEN 'cardiovascular_health' THEN 20.00
    END
FROM goals g
CROSS JOIN focus_area_definitions fa
WHERE g.goal_type = 'bp'
AND fa.key IN ('blood_pressure', 'cardiovascular_health')
ON CONFLICT (goal_id, focus_area_id) DO NOTHING;

-- HRV goals → hrv (70%) + stress_resistance (30%)
INSERT INTO goal_focus_contributions (goal_id, focus_area_id, contribution_weight)
SELECT g.id, fa.id,
    CASE fa.key
        WHEN 'hrv' THEN 70.00
        WHEN 'stress_resistance' THEN 30.00
    END
FROM goals g
CROSS JOIN focus_area_definitions fa
WHERE g.goal_type = 'hrv'
AND fa.key IN ('hrv', 'stress_resistance')
ON CONFLICT (goal_id, focus_area_id) DO NOTHING;

-- Sleep quality goals → sleep_quality (100%)
INSERT INTO goal_focus_contributions (goal_id, focus_area_id, contribution_weight)
SELECT g.id, get_focus_area_id('sleep_quality'), 100.00
FROM goals g
WHERE g.goal_type = 'sleep_quality'
ON CONFLICT (goal_id, focus_area_id) DO NOTHING;

-- Training frequency goals → general catch-all (strength + endurance + health)
INSERT INTO goal_focus_contributions (goal_id, focus_area_id, contribution_weight)
SELECT g.id, fa.id,
    CASE fa.key
        WHEN 'strength' THEN 40.00
        WHEN 'aerobic_endurance' THEN 40.00
        WHEN 'general_health' THEN 20.00
    END
FROM goals g
CROSS JOIN focus_area_definitions fa
WHERE g.goal_type = 'training_frequency'
AND fa.key IN ('strength', 'aerobic_endurance', 'general_health')
ON CONFLICT (goal_id, focus_area_id) DO NOTHING;

-- Clean up helper function
DROP FUNCTION IF EXISTS get_focus_area_id(VARCHAR);
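Every auto-mapping above distributes exactly 100% across its target areas, so a post-migration sanity check can assert that invariant. This diagnostic query is a sketch, not part of the migration:

```sql
-- Expect zero rows: each auto-mapped goal's weights should sum to 100.00
SELECT goal_id, SUM(contribution_weight) AS total
FROM goal_focus_contributions
GROUP BY goal_id
HAVING SUM(contribution_weight) <> 100.00;
```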

-- ============================================================================
-- Summary
-- ============================================================================

COMMENT ON TABLE focus_area_definitions IS
'v2.0: Dynamic focus areas - replaces hardcoded 6-dimension system.
26 base areas across 7 categories. User-extensible via admin UI.';

COMMENT ON TABLE goal_focus_contributions IS
'Many-to-many mapping: goals contribute to multiple focus areas with weights.
Auto-mapped from goal_type, editable by user.';

COMMENT ON TABLE user_focus_preferences IS
'Legacy flat structure (weight_loss_pct, muscle_gain_pct, etc.) remains for backward compatibility.
Future: use focus_area_definitions + dynamic preferences.';

backend/migrations/032_user_focus_area_weights.sql (new file, 53 lines)

-- Migration 032: User Focus Area Weights
-- Date: 2026-03-27
-- Purpose: Allow users to set custom weights for focus areas (dynamic preferences)

-- ============================================================================
-- User Focus Area Weights (many-to-many with weights)
-- ============================================================================

CREATE TABLE IF NOT EXISTS user_focus_area_weights (
    profile_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,
    focus_area_id UUID NOT NULL REFERENCES focus_area_definitions(id) ON DELETE CASCADE,
    weight INTEGER NOT NULL DEFAULT 0 CHECK (weight >= 0 AND weight <= 100),
    created_at TIMESTAMP DEFAULT NOW(),
    updated_at TIMESTAMP DEFAULT NOW(),
    PRIMARY KEY (profile_id, focus_area_id)
);

CREATE INDEX IF NOT EXISTS idx_user_focus_weights_profile ON user_focus_area_weights(profile_id);
CREATE INDEX IF NOT EXISTS idx_user_focus_weights_area ON user_focus_area_weights(focus_area_id);

COMMENT ON TABLE user_focus_area_weights IS 'User-specific weights for focus areas (dynamic system)';
COMMENT ON COLUMN user_focus_area_weights.weight IS 'Relative weight (0-100) - will be normalized to percentages in UI';

-- ============================================================================
-- Migrate legacy preferences to dynamic weights
-- ============================================================================

-- For each user with legacy preferences, create weights for the 6 base areas
INSERT INTO user_focus_area_weights (profile_id, focus_area_id, weight)
SELECT
    ufp.profile_id,
    fad.id AS focus_area_id,
    CASE fad.key
        WHEN 'weight_loss' THEN ufp.weight_loss_pct
        WHEN 'muscle_gain' THEN ufp.muscle_gain_pct
        WHEN 'strength' THEN ufp.strength_pct
        WHEN 'aerobic_endurance' THEN ufp.endurance_pct
        WHEN 'flexibility' THEN ufp.flexibility_pct
        WHEN 'general_health' THEN ufp.health_pct
        ELSE 0
    END AS weight
FROM user_focus_preferences ufp
CROSS JOIN focus_area_definitions fad
WHERE fad.key IN ('weight_loss', 'muscle_gain', 'strength', 'aerobic_endurance', 'flexibility', 'general_health')
AND (
    (fad.key = 'weight_loss' AND ufp.weight_loss_pct > 0) OR
    (fad.key = 'muscle_gain' AND ufp.muscle_gain_pct > 0) OR
    (fad.key = 'strength' AND ufp.strength_pct > 0) OR
    (fad.key = 'aerobic_endurance' AND ufp.endurance_pct > 0) OR
    (fad.key = 'flexibility' AND ufp.flexibility_pct > 0) OR
    (fad.key = 'general_health' AND ufp.health_pct > 0)
)
ON CONFLICT (profile_id, focus_area_id) DO NOTHING;
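The `weight` column holds relative values that the UI normalizes to percentages. One way that normalization could be done in SQL is sketched below; the per-profile partitioning is an assumption, since the migration leaves the normalization logic to the UI:

```sql
-- Sketch: normalize relative weights to percentages per profile
SELECT profile_id,
       focus_area_id,
       weight,
       ROUND(100.0 * weight
             / NULLIF(SUM(weight) OVER (PARTITION BY profile_id), 0), 1) AS pct
FROM user_focus_area_weights;
```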

backend/migrations/check_features.sql (new file, 50 lines)

-- ============================================================================
-- Feature check script - diagnostics before/after migration
-- ============================================================================
-- Usage: psql -U mitai_dev -d mitai_dev -f check_features.sql
-- ============================================================================

\echo '=== CURRENT FEATURES ==='
SELECT id, name, category, limit_type, reset_period, default_limit, active
FROM features
ORDER BY category, id;

\echo ''
\echo '=== TIER LIMITS MATRIX ==='
SELECT
    f.id AS feature,
    f.category,
    MAX(CASE WHEN tl.tier_id = 'free' THEN COALESCE(tl.limit_value::text, '∞') END) AS free,
    MAX(CASE WHEN tl.tier_id = 'basic' THEN COALESCE(tl.limit_value::text, '∞') END) AS basic,
    MAX(CASE WHEN tl.tier_id = 'premium' THEN COALESCE(tl.limit_value::text, '∞') END) AS premium,
    MAX(CASE WHEN tl.tier_id = 'selfhosted' THEN COALESCE(tl.limit_value::text, '∞') END) AS selfhosted
FROM features f
LEFT JOIN tier_limits tl ON f.id = tl.feature_id
GROUP BY f.id, f.category
ORDER BY f.category, f.id;

\echo ''
\echo '=== FEATURE COUNT BY CATEGORY ==='
SELECT category, COUNT(*) AS count
FROM features
WHERE active = true
GROUP BY category
ORDER BY category;

\echo ''
\echo '=== ORPHANED TIER LIMITS (feature does not exist) ==='
SELECT tl.tier_id, tl.feature_id, tl.limit_value
FROM tier_limits tl
LEFT JOIN features f ON tl.feature_id = f.id
WHERE f.id IS NULL;

\echo ''
\echo '=== USER FEATURE USAGE (current usage tracking) ==='
SELECT
    p.name AS "user",  -- quoted: USER is a reserved word in PostgreSQL
    ufu.feature_id,
    ufu.usage_count,
    ufu.reset_at
FROM user_feature_usage ufu
JOIN profiles p ON ufu.profile_id = p.id
ORDER BY p.name, ufu.feature_id;

backend/migrations/v9c_cleanup_features.sql (new file, 141 lines)

-- ============================================================================
-- v9c Cleanup: Feature consolidation
-- ============================================================================
-- Created: 2026-03-20
-- Purpose: Consolidate export features (export_csv/json/zip → data_export)
-- and import features (csv_import → data_import)
--
-- Idempotent: can be run multiple times
--
-- Lessons learned:
-- "One feature for export, not three (csv/json/zip)"
-- ============================================================================

-- ============================================================================
-- 1. Rename csv_import to data_import
-- ============================================================================
-- Note: features.id is referenced by tier_limits, user_feature_restrictions
-- and user_feature_usage without ON UPDATE CASCADE, so updating the primary
-- key in place would violate those foreign keys. Instead: create the new row,
-- repoint the references, then drop the old row.
INSERT INTO features (id, name, description, category, limit_type, reset_period, default_limit, active)
VALUES ('data_import', 'Daten importieren', 'CSV-Import (FDDB, Apple Health) + ZIP-Backup-Import', 'import', 'count', 'monthly', 0, true)
ON CONFLICT (id) DO UPDATE SET
    name = EXCLUDED.name,
    description = EXCLUDED.description;

-- Update tier_limits references
UPDATE tier_limits
SET feature_id = 'data_import'
WHERE feature_id = 'csv_import';

-- Update user_feature_restrictions references
UPDATE user_feature_restrictions
SET feature_id = 'data_import'
WHERE feature_id = 'csv_import';

-- Update user_feature_usage references
UPDATE user_feature_usage
SET feature_id = 'data_import'
WHERE feature_id = 'csv_import';

-- Remove the old feature row (any remaining references cascade-delete)
DELETE FROM features
WHERE id = 'csv_import';

-- ============================================================================
-- 2. Remove old export_csv/json/zip features
-- ============================================================================

-- Remove tier_limits for old features
DELETE FROM tier_limits
WHERE feature_id IN ('export_csv', 'export_json', 'export_zip');

-- Remove user restrictions for old features
DELETE FROM user_feature_restrictions
WHERE feature_id IN ('export_csv', 'export_json', 'export_zip');

-- Remove usage tracking for old features
DELETE FROM user_feature_usage
WHERE feature_id IN ('export_csv', 'export_json', 'export_zip');

-- Remove old feature definitions
DELETE FROM features
WHERE id IN ('export_csv', 'export_json', 'export_zip');

-- ============================================================================
-- 3. Ensure data_export exists and is properly configured
-- ============================================================================
INSERT INTO features (id, name, description, category, limit_type, reset_period, default_limit, active)
VALUES ('data_export', 'Daten exportieren', 'CSV/JSON/ZIP Export', 'export', 'count', 'monthly', 0, true)
ON CONFLICT (id) DO UPDATE SET
    name = EXCLUDED.name,
    description = EXCLUDED.description,
    category = EXCLUDED.category,
    limit_type = EXCLUDED.limit_type,
    reset_period = EXCLUDED.reset_period;

-- ============================================================================
-- 4. Ensure data_import exists and is properly configured
-- ============================================================================
INSERT INTO features (id, name, description, category, limit_type, reset_period, default_limit, active)
VALUES ('data_import', 'Daten importieren', 'CSV-Import (FDDB, Apple Health) + ZIP-Backup-Import', 'import', 'count', 'monthly', 0, true)
ON CONFLICT (id) DO UPDATE SET
    name = EXCLUDED.name,
    description = EXCLUDED.description,
    category = EXCLUDED.category,
    limit_type = EXCLUDED.limit_type,
    reset_period = EXCLUDED.reset_period;

-- ============================================================================
-- 5. Update tier_limits for data_export (consolidate from old features)
-- ============================================================================

-- FREE tier: no export
INSERT INTO tier_limits (tier_id, feature_id, limit_value)
VALUES ('free', 'data_export', 0)
ON CONFLICT (tier_id, feature_id) DO UPDATE SET limit_value = EXCLUDED.limit_value;

-- BASIC tier: 5 exports/month
INSERT INTO tier_limits (tier_id, feature_id, limit_value)
VALUES ('basic', 'data_export', 5)
ON CONFLICT (tier_id, feature_id) DO UPDATE SET limit_value = EXCLUDED.limit_value;

-- PREMIUM tier: unlimited
INSERT INTO tier_limits (tier_id, feature_id, limit_value)
VALUES ('premium', 'data_export', NULL)
ON CONFLICT (tier_id, feature_id) DO UPDATE SET limit_value = EXCLUDED.limit_value;

-- SELFHOSTED tier: unlimited
INSERT INTO tier_limits (tier_id, feature_id, limit_value)
VALUES ('selfhosted', 'data_export', NULL)
ON CONFLICT (tier_id, feature_id) DO UPDATE SET limit_value = EXCLUDED.limit_value;

-- ============================================================================
-- 6. Update tier_limits for data_import
-- ============================================================================

-- FREE tier: no import
INSERT INTO tier_limits (tier_id, feature_id, limit_value)
VALUES ('free', 'data_import', 0)
ON CONFLICT (tier_id, feature_id) DO UPDATE SET limit_value = EXCLUDED.limit_value;

-- BASIC tier: 3 imports/month
INSERT INTO tier_limits (tier_id, feature_id, limit_value)
VALUES ('basic', 'data_import', 3)
ON CONFLICT (tier_id, feature_id) DO UPDATE SET limit_value = EXCLUDED.limit_value;

-- PREMIUM tier: unlimited
INSERT INTO tier_limits (tier_id, feature_id, limit_value)
VALUES ('premium', 'data_import', NULL)
ON CONFLICT (tier_id, feature_id) DO UPDATE SET limit_value = EXCLUDED.limit_value;

-- SELFHOSTED tier: unlimited
INSERT INTO tier_limits (tier_id, feature_id, limit_value)
VALUES ('selfhosted', 'data_import', NULL)
ON CONFLICT (tier_id, feature_id) DO UPDATE SET limit_value = EXCLUDED.limit_value;

-- ============================================================================
-- Cleanup complete
-- ============================================================================
-- Final feature list:
-- Data: weight_entries, circumference_entries, caliper_entries,
--       nutrition_entries, activity_entries, photos
-- AI: ai_calls, ai_pipeline
-- Export/Import: data_export, data_import
--
-- Total: 10 features (down from 13)
-- ============================================================================

backend/migrations/v9c_fix_features.sql (new file, 33 lines)

-- Fix missing features for v9c feature enforcement
-- 2026-03-20

-- Add missing features
INSERT INTO features (id, name, description, category, limit_type, reset_period, default_limit, active) VALUES
('data_export', 'Daten exportieren', 'CSV/JSON/ZIP Export', 'export', 'count', 'monthly', 0, true),
('csv_import', 'CSV importieren', 'FDDB/Apple Health CSV Import + ZIP Backup Import', 'import', 'count', 'monthly', 0, true)
ON CONFLICT (id) DO NOTHING;

-- Add tier limits for new features
-- FREE tier
INSERT INTO tier_limits (tier_id, feature_id, limit_value) VALUES
('free', 'data_export', 0),  -- no export
('free', 'csv_import', 0)    -- no import
ON CONFLICT (tier_id, feature_id) DO NOTHING;

-- BASIC tier
INSERT INTO tier_limits (tier_id, feature_id, limit_value) VALUES
('basic', 'data_export', 5), -- 5 exports/month
('basic', 'csv_import', 3)   -- 3 imports/month
ON CONFLICT (tier_id, feature_id) DO NOTHING;

-- PREMIUM tier
INSERT INTO tier_limits (tier_id, feature_id, limit_value) VALUES
('premium', 'data_export', NULL), -- unlimited
('premium', 'csv_import', NULL)   -- unlimited
ON CONFLICT (tier_id, feature_id) DO NOTHING;

-- SELFHOSTED tier
INSERT INTO tier_limits (tier_id, feature_id, limit_value) VALUES
('selfhosted', 'data_export', NULL), -- unlimited
('selfhosted', 'csv_import', NULL)   -- unlimited
ON CONFLICT (tier_id, feature_id) DO NOTHING;

backend/migrations/v9c_subscription_system.sql (new file, 352 lines)

-- ============================================================================
-- Mitai Jinkendo v9c: Subscription & Coupon System Migration
-- ============================================================================
-- Created: 2026-03-19
-- Purpose: Add flexible tier system with Feature-Registry Pattern
--
-- Tables added:
-- 1. app_settings - Global configuration
-- 2. tiers - Subscription tiers (simplified)
-- 3. features - Feature registry (all limitable features)
-- 4. tier_limits - Tier x Feature matrix
-- 5. user_feature_restrictions - Individual user overrides
-- 6. user_feature_usage - Usage tracking
-- 7. coupons - Coupon management
-- 8. coupon_redemptions - Redemption history
-- 9. access_grants - Time-limited access grants
-- 10. user_activity_log - Activity tracking
-- 11. user_stats - Aggregated statistics
--
-- Feature-Registry Pattern:
-- Instead of hardcoded columns (max_weight_entries, max_ai_calls),
-- all limits are defined in the features table and configured via tier_limits.
-- This allows adding new limitable features without schema changes.
-- ============================================================================

-- ============================================================================
-- 1. app_settings - Global configuration
-- ============================================================================
CREATE TABLE IF NOT EXISTS app_settings (
    key TEXT PRIMARY KEY,
    value TEXT NOT NULL,
    description TEXT,
    updated TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- ============================================================================
-- 2. tiers - Subscription tiers (simplified)
-- ============================================================================
CREATE TABLE IF NOT EXISTS tiers (
    id TEXT PRIMARY KEY,              -- 'free', 'basic', 'premium', 'selfhosted'
    name TEXT NOT NULL,               -- Display name
    description TEXT,                 -- Marketing description
    price_monthly_cents INTEGER,      -- NULL for free/selfhosted
    price_yearly_cents INTEGER,       -- NULL for free/selfhosted
    stripe_price_id_monthly TEXT,     -- Stripe Price ID (for v9d)
    stripe_price_id_yearly TEXT,      -- Stripe Price ID (for v9d)
    active BOOLEAN DEFAULT true,      -- Can new users subscribe?
    sort_order INTEGER DEFAULT 0,
    created TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- ============================================================================
-- 3. features - Feature registry (all limitable features)
-- ============================================================================
CREATE TABLE IF NOT EXISTS features (
    id TEXT PRIMARY KEY,              -- 'weight_entries', 'ai_calls', 'photos', etc.
    name TEXT NOT NULL,               -- Display name
    description TEXT,                 -- What is this feature?
    category TEXT,                    -- 'data', 'ai', 'export', 'integration'
    limit_type TEXT DEFAULT 'count',  -- 'count', 'boolean', 'quota'
    reset_period TEXT DEFAULT 'never',-- 'never', 'monthly', 'daily'
    default_limit INTEGER,            -- Fallback if no tier_limit defined
    active BOOLEAN DEFAULT true,      -- Is this feature currently used?
    created TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- ============================================================================
-- 4. tier_limits - Tier x Feature matrix
-- ============================================================================
CREATE TABLE IF NOT EXISTS tier_limits (
    id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
    tier_id TEXT NOT NULL REFERENCES tiers(id) ON DELETE CASCADE,
    feature_id TEXT NOT NULL REFERENCES features(id) ON DELETE CASCADE,
    limit_value INTEGER,              -- NULL = unlimited, 0 = disabled
    created TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    UNIQUE(tier_id, feature_id)
);

-- ============================================================================
-- 5. user_feature_restrictions - Individual user overrides
-- ============================================================================
CREATE TABLE IF NOT EXISTS user_feature_restrictions (
    id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
    profile_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,
    feature_id TEXT NOT NULL REFERENCES features(id) ON DELETE CASCADE,
    limit_value INTEGER,              -- NULL = unlimited, 0 = disabled
    reason TEXT,                      -- Why was this override applied?
    created_by UUID,                  -- Admin profile_id
    created TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    UNIQUE(profile_id, feature_id)
);
|
||||||
|
|
||||||
|
-- ============================================================================
|
||||||
|
-- 6. user_feature_usage - Usage tracking
|
||||||
|
-- ============================================================================
|
||||||
|
CREATE TABLE IF NOT EXISTS user_feature_usage (
|
||||||
|
id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
|
||||||
|
profile_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,
|
||||||
|
feature_id TEXT NOT NULL REFERENCES features(id) ON DELETE CASCADE,
|
||||||
|
usage_count INTEGER DEFAULT 0,
|
||||||
|
reset_at TIMESTAMP, -- When does this counter reset?
|
||||||
|
created TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
|
||||||
|
updated TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
|
||||||
|
UNIQUE(profile_id, feature_id)
|
||||||
|
);

-- ============================================================================
-- 7. coupons - Coupon management
-- ============================================================================
CREATE TABLE IF NOT EXISTS coupons (
    id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
    code TEXT UNIQUE NOT NULL,
    type TEXT NOT NULL,  -- 'single_use', 'period', 'wellpass'
    tier_id TEXT REFERENCES tiers(id) ON DELETE SET NULL,
    duration_days INTEGER,  -- For period/wellpass coupons
    max_redemptions INTEGER,  -- NULL = unlimited
    redemption_count INTEGER DEFAULT 0,
    valid_from TIMESTAMP,
    valid_until TIMESTAMP,
    active BOOLEAN DEFAULT true,
    created_by UUID,  -- Admin profile_id
    description TEXT,  -- Internal note
    created TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- ============================================================================
-- 8. coupon_redemptions - Redemption history
-- ============================================================================
CREATE TABLE IF NOT EXISTS coupon_redemptions (
    id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
    coupon_id UUID NOT NULL REFERENCES coupons(id) ON DELETE CASCADE,
    profile_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,
    redeemed_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    access_grant_id UUID,  -- FK to access_grants (created as result)
    UNIQUE(coupon_id, profile_id)  -- One redemption per user per coupon
);
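The coupon columns imply the checks a redeem endpoint has to make before inserting a redemption row: active flag, validity window, and redemption cap (per-user uniqueness is already enforced by the UNIQUE constraint). A sketch of that gate over a plain dict, under those assumptions; `can_redeem` is a hypothetical name, not an endpoint shown in this diff:

```python
from datetime import datetime

def can_redeem(coupon: dict, now: datetime) -> bool:
    """Mirror the coupons columns: active flag, validity window, redemption cap."""
    if not coupon.get("active", False):
        return False
    if coupon.get("valid_from") and now < coupon["valid_from"]:
        return False
    if coupon.get("valid_until") and now > coupon["valid_until"]:
        return False
    cap = coupon.get("max_redemptions")  # NULL/None = unlimited
    if cap is not None and coupon.get("redemption_count", 0) >= cap:
        return False
    return True
```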

-- ============================================================================
-- 9. access_grants - Time-limited access grants
-- ============================================================================
CREATE TABLE IF NOT EXISTS access_grants (
    id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
    profile_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,
    tier_id TEXT NOT NULL REFERENCES tiers(id) ON DELETE CASCADE,
    granted_by TEXT,  -- 'coupon', 'admin', 'trial', 'subscription'
    coupon_id UUID REFERENCES coupons(id) ON DELETE SET NULL,
    valid_from TIMESTAMP NOT NULL,
    valid_until TIMESTAMP NOT NULL,
    is_active BOOLEAN DEFAULT true,  -- Can be paused by Wellpass logic
    paused_by UUID,  -- access_grant.id that paused this
    paused_at TIMESTAMP,  -- When was it paused?
    remaining_days INTEGER,  -- Days left when paused (for resume)
    created TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
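`paused_by`, `paused_at`, and `remaining_days` suggest a pause-and-resume flow (e.g. a Wellpass grant suspending a coupon grant and restoring its leftover time later). A sketch of the day arithmetic only, assuming whole-day granularity; both function names are hypothetical:

```python
from datetime import datetime, timedelta

def pause_grant(valid_until: datetime, now: datetime) -> int:
    """Days left at pause time, to be stored in remaining_days for a later resume."""
    return max((valid_until - now).days, 0)

def resume_grant(remaining_days: int, now: datetime) -> datetime:
    """New valid_until when the grant is reactivated."""
    return now + timedelta(days=remaining_days)
```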

-- ============================================================================
-- 10. user_activity_log - Activity tracking
-- ============================================================================
CREATE TABLE IF NOT EXISTS user_activity_log (
    id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
    profile_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,
    action TEXT NOT NULL,  -- 'login', 'logout', 'coupon_redeemed', 'tier_changed'
    details JSONB,  -- Flexible metadata
    ip_address TEXT,
    user_agent TEXT,
    created TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

CREATE INDEX IF NOT EXISTS idx_activity_log_profile ON user_activity_log(profile_id, created DESC);
CREATE INDEX IF NOT EXISTS idx_activity_log_action ON user_activity_log(action, created DESC);

-- ============================================================================
-- 11. user_stats - Aggregated statistics
-- ============================================================================
CREATE TABLE IF NOT EXISTS user_stats (
    profile_id UUID PRIMARY KEY REFERENCES profiles(id) ON DELETE CASCADE,
    last_login TIMESTAMP,
    login_count INTEGER DEFAULT 0,
    weight_entries_count INTEGER DEFAULT 0,
    ai_calls_count INTEGER DEFAULT 0,
    photos_count INTEGER DEFAULT 0,
    total_data_points INTEGER DEFAULT 0,
    created TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- ============================================================================
-- Extend profiles table with subscription fields
-- ============================================================================
ALTER TABLE profiles ADD COLUMN IF NOT EXISTS tier TEXT DEFAULT 'free';
ALTER TABLE profiles ADD COLUMN IF NOT EXISTS trial_ends_at TIMESTAMP;
ALTER TABLE profiles ADD COLUMN IF NOT EXISTS email_verified BOOLEAN DEFAULT false;
ALTER TABLE profiles ADD COLUMN IF NOT EXISTS email_verify_token TEXT;
ALTER TABLE profiles ADD COLUMN IF NOT EXISTS invited_by UUID REFERENCES profiles(id) ON DELETE SET NULL;
ALTER TABLE profiles ADD COLUMN IF NOT EXISTS invitation_token TEXT;

-- ============================================================================
-- Insert initial data
-- ============================================================================

-- App settings
INSERT INTO app_settings (key, value, description) VALUES
    ('trial_duration_days', '14', 'Default trial duration for new registrations'),
    ('post_trial_tier', 'free', 'Tier after trial expires (free/disabled)'),
    ('require_email_verification', 'true', 'Require email verification before activation'),
    ('self_registration_enabled', 'true', 'Allow self-registration')
ON CONFLICT (key) DO NOTHING;

-- Tiers
INSERT INTO tiers (id, name, description, price_monthly_cents, price_yearly_cents, active, sort_order) VALUES
    ('free', 'Free', 'Eingeschränkte Basis-Funktionen', NULL, NULL, true, 1),
    ('basic', 'Basic', 'Kernfunktionen ohne KI', 499, 4990, true, 2),
    ('premium', 'Premium', 'Alle Features inkl. KI und Connectoren', 999, 9990, true, 3),
    ('selfhosted', 'Self-Hosted', 'Unbegrenzt (für Heimserver)', NULL, NULL, false, 4)
ON CONFLICT (id) DO NOTHING;

-- Features (11 initial features)
INSERT INTO features (id, name, description, category, limit_type, reset_period, default_limit, active) VALUES
    ('weight_entries', 'Gewichtseinträge', 'Anzahl Gewichtsmessungen', 'data', 'count', 'never', NULL, true),
    ('circumference_entries', 'Umfangs-Einträge', 'Anzahl Umfangsmessungen', 'data', 'count', 'never', NULL, true),
    ('caliper_entries', 'Caliper-Einträge', 'Anzahl Hautfaltenmessungen', 'data', 'count', 'never', NULL, true),
    ('nutrition_entries', 'Ernährungs-Einträge', 'Anzahl Ernährungslogs', 'data', 'count', 'never', NULL, true),
    ('activity_entries', 'Aktivitäts-Einträge', 'Anzahl Trainings/Aktivitäten', 'data', 'count', 'never', NULL, true),
    ('photos', 'Progress-Fotos', 'Anzahl hochgeladene Fotos', 'data', 'count', 'never', NULL, true),
    ('ai_calls', 'KI-Analysen', 'KI-Auswertungen pro Monat', 'ai', 'count', 'monthly', 0, true),
    ('ai_pipeline', 'KI-Pipeline', 'Vollständige Pipeline-Analyse', 'ai', 'boolean', 'never', 0, true),
    ('export_csv', 'CSV-Export', 'Daten als CSV exportieren', 'export', 'boolean', 'never', 0, true),
    ('export_json', 'JSON-Export', 'Daten als JSON exportieren', 'export', 'boolean', 'never', 0, true),
    ('export_zip', 'ZIP-Export', 'Vollständiger Backup-Export', 'export', 'boolean', 'never', 0, true)
ON CONFLICT (id) DO NOTHING;

-- Tier x Feature Matrix (tier_limits)
-- Format: (tier, feature, limit) - NULL = unlimited, 0 = disabled

-- FREE tier (very restricted)
INSERT INTO tier_limits (tier_id, feature_id, limit_value) VALUES
    ('free', 'weight_entries', 30),
    ('free', 'circumference_entries', 10),
    ('free', 'caliper_entries', 10),
    ('free', 'nutrition_entries', 30),
    ('free', 'activity_entries', 30),
    ('free', 'photos', 5),
    ('free', 'ai_calls', 0),  -- No AI
    ('free', 'ai_pipeline', 0),  -- No pipeline
    ('free', 'export_csv', 0),  -- No export
    ('free', 'export_json', 0),
    ('free', 'export_zip', 0)
ON CONFLICT (tier_id, feature_id) DO NOTHING;

-- BASIC tier (core features)
INSERT INTO tier_limits (tier_id, feature_id, limit_value) VALUES
    ('basic', 'weight_entries', NULL),  -- Unlimited
    ('basic', 'circumference_entries', NULL),
    ('basic', 'caliper_entries', NULL),
    ('basic', 'nutrition_entries', NULL),
    ('basic', 'activity_entries', NULL),
    ('basic', 'photos', 50),
    ('basic', 'ai_calls', 3),  -- 3 AI calls/month
    ('basic', 'ai_pipeline', 0),  -- No pipeline
    ('basic', 'export_csv', 1),  -- Export allowed
    ('basic', 'export_json', 1),
    ('basic', 'export_zip', 1)
ON CONFLICT (tier_id, feature_id) DO NOTHING;

-- PREMIUM tier (everything unlimited)
INSERT INTO tier_limits (tier_id, feature_id, limit_value) VALUES
    ('premium', 'weight_entries', NULL),
    ('premium', 'circumference_entries', NULL),
    ('premium', 'caliper_entries', NULL),
    ('premium', 'nutrition_entries', NULL),
    ('premium', 'activity_entries', NULL),
    ('premium', 'photos', NULL),
    ('premium', 'ai_calls', NULL),  -- Unlimited AI
    ('premium', 'ai_pipeline', 1),  -- Pipeline allowed
    ('premium', 'export_csv', 1),
    ('premium', 'export_json', 1),
    ('premium', 'export_zip', 1)
ON CONFLICT (tier_id, feature_id) DO NOTHING;

-- SELFHOSTED tier (everything unlimited)
INSERT INTO tier_limits (tier_id, feature_id, limit_value) VALUES
    ('selfhosted', 'weight_entries', NULL),
    ('selfhosted', 'circumference_entries', NULL),
    ('selfhosted', 'caliper_entries', NULL),
    ('selfhosted', 'nutrition_entries', NULL),
    ('selfhosted', 'activity_entries', NULL),
    ('selfhosted', 'photos', NULL),
    ('selfhosted', 'ai_calls', NULL),
    ('selfhosted', 'ai_pipeline', 1),
    ('selfhosted', 'export_csv', 1),
    ('selfhosted', 'export_json', 1),
    ('selfhosted', 'export_zip', 1)
ON CONFLICT (tier_id, feature_id) DO NOTHING;
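Taken together, the tables above imply a three-step lookup for a user's effective limit: `user_feature_restrictions` first, then the `tier_limits` matrix, then `features.default_limit`, with NULL meaning unlimited and 0 disabled. A sketch of that precedence over plain values, not the app's actual resolver; `UNSET` is a hypothetical sentinel distinguishing "no row" from an explicit NULL:

```python
UNSET = object()  # "no row exists" - distinct from None (= SQL NULL = unlimited)

def effective_limit(user_override, tier_limit, default_limit):
    """Precedence: user_feature_restrictions > tier_limits > features.default_limit."""
    for value in (user_override, tier_limit, default_limit):
        if value is not UNSET:
            return value  # may be None (unlimited) or 0 (disabled)
    return 0  # no rule anywhere: safest to treat as disabled

def is_allowed(limit, current_usage: int) -> bool:
    if limit is None:  # NULL = unlimited
        return True
    return current_usage < limit  # 0 = disabled (never allowed)
```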

-- ============================================================================
-- Migrate existing profiles
-- ============================================================================

-- Lars' profile → selfhosted tier with email verified
UPDATE profiles
SET
    tier = 'selfhosted',
    email_verified = true
WHERE
    email = 'lars@stommer.com'
    OR role = 'admin';

-- Other existing profiles → free tier, unverified
UPDATE profiles
SET
    tier = 'free',
    email_verified = false
WHERE
    tier IS NULL
    OR tier = '';

-- Initialize user_stats for existing profiles
INSERT INTO user_stats (profile_id, weight_entries_count, photos_count)
SELECT
    p.id,
    (SELECT COUNT(*) FROM weight_log WHERE profile_id = p.id),
    (SELECT COUNT(*) FROM photos WHERE profile_id = p.id)
FROM profiles p
ON CONFLICT (profile_id) DO NOTHING;

-- ============================================================================
-- Create indexes for performance
-- ============================================================================
CREATE INDEX IF NOT EXISTS idx_tier_limits_tier ON tier_limits(tier_id);
CREATE INDEX IF NOT EXISTS idx_tier_limits_feature ON tier_limits(feature_id);
CREATE INDEX IF NOT EXISTS idx_user_restrictions_profile ON user_feature_restrictions(profile_id);
CREATE INDEX IF NOT EXISTS idx_user_usage_profile ON user_feature_usage(profile_id);
CREATE INDEX IF NOT EXISTS idx_access_grants_profile ON access_grants(profile_id, valid_until DESC);
CREATE INDEX IF NOT EXISTS idx_access_grants_active ON access_grants(profile_id, is_active, valid_until DESC);
CREATE INDEX IF NOT EXISTS idx_coupons_code ON coupons(code);
CREATE INDEX IF NOT EXISTS idx_coupon_redemptions_profile ON coupon_redemptions(profile_id);

-- ============================================================================
-- Migration complete
-- ============================================================================
-- Run this migration with:
--   psql -h localhost -U mitai_prod -d mitai_prod < backend/migrations/v9c_subscription_system.sql
--
-- Or via Docker:
--   docker exec -i mitai-postgres psql -U mitai_prod -d mitai_prod < backend/migrations/v9c_subscription_system.sql
-- ============================================================================
@@ -27,6 +27,7 @@ class ProfileUpdate(BaseModel):
     height: Optional[float] = None
     goal_weight: Optional[float] = None
     goal_bf_pct: Optional[float] = None
+    quality_filter_level: Optional[str] = None  # Issue #31: Global quality filter


 # ── Tracking Models ───────────────────────────────────────────────────────────

@@ -84,6 +85,9 @@ class ActivityEntry(BaseModel):
     rpe: Optional[int] = None
     source: Optional[str] = 'manual'
     notes: Optional[str] = None
+    training_type_id: Optional[int] = None  # v9d: Training type categorization
+    training_category: Optional[str] = None  # v9d: Denormalized category
+    training_subcategory: Optional[str] = None  # v9d: Denormalized subcategory


 class NutritionDay(BaseModel):

@@ -110,6 +114,12 @@ class PasswordResetConfirm(BaseModel):
     new_password: str


+class RegisterRequest(BaseModel):
+    name: str
+    email: str
+    password: str
+
+
 # ── Admin Models ──────────────────────────────────────────────────────────────

 class AdminProfileUpdate(BaseModel):
@@ -117,3 +127,116 @@ class AdminProfileUpdate(BaseModel):
     ai_enabled: Optional[int] = None
     ai_limit_day: Optional[int] = None
     export_enabled: Optional[int] = None
+
+
+# ── Prompt Models (Issue #28) ────────────────────────────────────────────────
+
+class PromptCreate(BaseModel):
+    name: str
+    slug: str
+    display_name: Optional[str] = None
+    description: Optional[str] = None
+    template: str
+    category: str = 'ganzheitlich'
+    active: bool = True
+    sort_order: int = 0
+
+
+class PromptUpdate(BaseModel):
+    name: Optional[str] = None
+    display_name: Optional[str] = None
+    description: Optional[str] = None
+    template: Optional[str] = None
+    category: Optional[str] = None
+    active: Optional[bool] = None
+    sort_order: Optional[int] = None
+
+
+class PromptGenerateRequest(BaseModel):
+    goal: str
+    data_categories: list[str]
+    example_output: Optional[str] = None
+
+
+# ── Unified Prompt System Models (Issue #28 Phase 2) ───────────────────────
+
+class StagePromptCreate(BaseModel):
+    """Single prompt within a stage"""
+    source: str  # 'inline' or 'reference'
+    slug: Optional[str] = None  # Required if source='reference'
+    template: Optional[str] = None  # Required if source='inline'
+    output_key: str  # Key for storing result (e.g., 'nutrition', 'stage1_body')
+    output_format: str = 'text'  # 'text' or 'json'
+    output_schema: Optional[dict] = None  # JSON schema if output_format='json'
+
+
+class StageCreate(BaseModel):
+    """Single stage with multiple prompts"""
+    stage: int  # Stage number (1, 2, 3, ...)
+    prompts: list[StagePromptCreate]
+
+
+class UnifiedPromptCreate(BaseModel):
+    """Create a new unified prompt (base or pipeline type)"""
+    name: str
+    slug: str
+    display_name: Optional[str] = None
+    description: Optional[str] = None
+    type: str  # 'base' or 'pipeline'
+    category: str = 'ganzheitlich'
+    active: bool = True
+    sort_order: int = 0
+
+    # For base prompts (single reusable template)
+    template: Optional[str] = None  # Required if type='base'
+    output_format: str = 'text'
+    output_schema: Optional[dict] = None
+
+    # For pipeline prompts (multi-stage workflow)
+    stages: Optional[list[StageCreate]] = None  # Required if type='pipeline'
+
+
+class UnifiedPromptUpdate(BaseModel):
+    """Update an existing unified prompt"""
+    name: Optional[str] = None
+    display_name: Optional[str] = None
+    description: Optional[str] = None
+    type: Optional[str] = None
+    category: Optional[str] = None
+    active: Optional[bool] = None
+    sort_order: Optional[int] = None
+    template: Optional[str] = None
+    output_format: Optional[str] = None
+    output_schema: Optional[dict] = None
+    stages: Optional[list[StageCreate]] = None
+
+
+# ── Pipeline Config Models (Issue #28) ─────────────────────────────────────
+# NOTE: These will be deprecated in favor of UnifiedPrompt models above
+
+class PipelineConfigCreate(BaseModel):
+    name: str
+    description: Optional[str] = None
+    is_default: bool = False
+    active: bool = True
+    modules: dict  # {"körper": true, "ernährung": true, ...}
+    timeframes: dict  # {"körper": 30, "ernährung": 30, ...}
+    stage1_prompts: list[str]  # Array of slugs
+    stage2_prompt: str  # slug
+    stage3_prompt: Optional[str] = None  # slug
+
+
+class PipelineConfigUpdate(BaseModel):
+    name: Optional[str] = None
+    description: Optional[str] = None
+    is_default: Optional[bool] = None
+    active: Optional[bool] = None
+    modules: Optional[dict] = None
+    timeframes: Optional[dict] = None
+    stage1_prompts: Optional[list[str]] = None
+    stage2_prompt: Optional[str] = None
+    stage3_prompt: Optional[str] = None
+
+
+class PipelineExecuteRequest(BaseModel):
+    config_id: Optional[str] = None  # None = use default config
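`StagePromptCreate` carries a cross-field rule that the type hints alone cannot express: `slug` is required when `source='reference'`, `template` when `source='inline'`. A standalone sketch of that rule as a plain function, independent of whatever validation the API layer actually performs; the function name is hypothetical:

```python
def check_stage_prompt(source: str, slug=None, template=None) -> list[str]:
    """Return a list of validation errors for one stage prompt."""
    errors = []
    if source == "reference" and not slug:
        errors.append("slug is required when source='reference'")
    elif source == "inline" and not template:
        errors.append("template is required when source='inline'")
    elif source not in ("reference", "inline"):
        errors.append(f"unknown source {source!r}")
    return errors
```

In a Pydantic model the same check would live in a model-level validator; the point here is only the reference/inline contract itself.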

715 backend/placeholder_resolver.py Normal file

@@ -0,0 +1,715 @@
"""
|
||||||
|
Placeholder Resolver for AI Prompts
|
||||||
|
|
||||||
|
Provides a registry of placeholder functions that resolve to actual user data.
|
||||||
|
Used for prompt templates and preview functionality.
|
||||||
|
"""
|
||||||
|
import re
|
||||||
|
from datetime import datetime, timedelta
|
||||||
|
from typing import Dict, List, Optional, Callable
|
||||||
|
from db import get_db, get_cursor, r2d
|
||||||
|
|
||||||
|
|
||||||
|
# ── Helper Functions ──────────────────────────────────────────────────────────
|
||||||
|
|
||||||
|
def get_profile_data(profile_id: str) -> Dict:
|
||||||
|
"""Load profile data for a user."""
|
||||||
|
with get_db() as conn:
|
||||||
|
cur = get_cursor(conn)
|
||||||
|
cur.execute("SELECT * FROM profiles WHERE id=%s", (profile_id,))
|
||||||
|
return r2d(cur.fetchone()) if cur.rowcount > 0 else {}
|
||||||
|
|
||||||
|
|
||||||
|
def get_latest_weight(profile_id: str) -> Optional[str]:
|
||||||
|
"""Get latest weight entry."""
|
||||||
|
with get_db() as conn:
|
||||||
|
cur = get_cursor(conn)
|
||||||
|
cur.execute(
|
||||||
|
"SELECT weight FROM weight_log WHERE profile_id=%s ORDER BY date DESC LIMIT 1",
|
||||||
|
(profile_id,)
|
||||||
|
)
|
||||||
|
row = cur.fetchone()
|
||||||
|
return f"{row['weight']:.1f} kg" if row else "nicht verfügbar"
|
||||||
|
|
||||||
|
|
||||||
|
def get_weight_trend(profile_id: str, days: int = 28) -> str:
|
||||||
|
"""Calculate weight trend description."""
|
||||||
|
with get_db() as conn:
|
||||||
|
cur = get_cursor(conn)
|
||||||
|
cutoff = (datetime.now() - timedelta(days=days)).strftime('%Y-%m-%d')
|
||||||
|
cur.execute(
|
||||||
|
"""SELECT weight, date FROM weight_log
|
||||||
|
WHERE profile_id=%s AND date >= %s
|
||||||
|
ORDER BY date""",
|
||||||
|
(profile_id, cutoff)
|
||||||
|
)
|
||||||
|
rows = [r2d(r) for r in cur.fetchall()]
|
||||||
|
|
||||||
|
if len(rows) < 2:
|
||||||
|
return "nicht genug Daten"
|
||||||
|
|
||||||
|
first = rows[0]['weight']
|
||||||
|
last = rows[-1]['weight']
|
||||||
|
delta = last - first
|
||||||
|
|
||||||
|
if abs(delta) < 0.3:
|
||||||
|
return "stabil"
|
||||||
|
elif delta > 0:
|
||||||
|
return f"steigend (+{delta:.1f} kg in {days} Tagen)"
|
||||||
|
else:
|
||||||
|
return f"sinkend ({delta:.1f} kg in {days} Tagen)"
|
||||||
|
|
||||||
|
|
||||||
|
def get_latest_bf(profile_id: str) -> Optional[str]:
|
||||||
|
"""Get latest body fat percentage from caliper."""
|
||||||
|
with get_db() as conn:
|
||||||
|
cur = get_cursor(conn)
|
||||||
|
cur.execute(
|
||||||
|
"""SELECT body_fat_pct FROM caliper_log
|
||||||
|
WHERE profile_id=%s AND body_fat_pct IS NOT NULL
|
||||||
|
ORDER BY date DESC LIMIT 1""",
|
||||||
|
(profile_id,)
|
||||||
|
)
|
||||||
|
row = cur.fetchone()
|
||||||
|
return f"{row['body_fat_pct']:.1f}%" if row else "nicht verfügbar"
|
||||||
|
|
||||||
|
|
||||||
|
def get_nutrition_avg(profile_id: str, field: str, days: int = 30) -> str:
|
||||||
|
"""Calculate average nutrition value."""
|
||||||
|
with get_db() as conn:
|
||||||
|
cur = get_cursor(conn)
|
||||||
|
cutoff = (datetime.now() - timedelta(days=days)).strftime('%Y-%m-%d')
|
||||||
|
|
||||||
|
# Map field names to actual column names
|
||||||
|
field_map = {
|
||||||
|
'protein': 'protein_g',
|
||||||
|
'fat': 'fat_g',
|
||||||
|
'carb': 'carbs_g',
|
||||||
|
'kcal': 'kcal'
|
||||||
|
}
|
||||||
|
db_field = field_map.get(field, field)
|
||||||
|
|
||||||
|
cur.execute(
|
||||||
|
f"""SELECT AVG({db_field}) as avg FROM nutrition_log
|
||||||
|
WHERE profile_id=%s AND date >= %s AND {db_field} IS NOT NULL""",
|
||||||
|
(profile_id, cutoff)
|
||||||
|
)
|
||||||
|
row = cur.fetchone()
|
||||||
|
if row and row['avg']:
|
||||||
|
if field == 'kcal':
|
||||||
|
return f"{int(row['avg'])} kcal/Tag (Ø {days} Tage)"
|
||||||
|
else:
|
||||||
|
return f"{int(row['avg'])}g/Tag (Ø {days} Tage)"
|
||||||
|
return "nicht verfügbar"
|
||||||
|
|
||||||
|
|
||||||
|
def get_caliper_summary(profile_id: str) -> str:
|
||||||
|
"""Get latest caliper measurements summary."""
|
||||||
|
with get_db() as conn:
|
||||||
|
cur = get_cursor(conn)
|
||||||
|
cur.execute(
|
||||||
|
"""SELECT body_fat_pct, sf_method, date FROM caliper_log
|
||||||
|
WHERE profile_id=%s AND body_fat_pct IS NOT NULL
|
||||||
|
ORDER BY date DESC LIMIT 1""",
|
||||||
|
(profile_id,)
|
||||||
|
)
|
||||||
|
row = r2d(cur.fetchone()) if cur.rowcount > 0 else None
|
||||||
|
|
||||||
|
if not row:
|
||||||
|
return "keine Caliper-Messungen"
|
||||||
|
|
||||||
|
method = row.get('sf_method', 'unbekannt')
|
||||||
|
return f"{row['body_fat_pct']:.1f}% ({method} am {row['date']})"
|
||||||
|
|
||||||
|
|
||||||
|
def get_circ_summary(profile_id: str) -> str:
|
||||||
|
"""Get latest circumference measurements summary with age annotations.
|
||||||
|
|
||||||
|
For each measurement point, fetches the most recent value (even if from different dates).
|
||||||
|
Annotates each value with measurement age for AI context.
|
||||||
|
"""
|
||||||
|
with get_db() as conn:
|
||||||
|
cur = get_cursor(conn)
|
||||||
|
|
||||||
|
# Define all circumference points with their labels
|
||||||
|
fields = [
|
||||||
|
('c_neck', 'Nacken'),
|
||||||
|
('c_chest', 'Brust'),
|
||||||
|
('c_waist', 'Taille'),
|
||||||
|
('c_belly', 'Bauch'),
|
||||||
|
('c_hip', 'Hüfte'),
|
||||||
|
('c_thigh', 'Oberschenkel'),
|
||||||
|
('c_calf', 'Wade'),
|
||||||
|
('c_arm', 'Arm')
|
||||||
|
]
|
||||||
|
|
||||||
|
parts = []
|
||||||
|
today = datetime.now().date()
|
||||||
|
|
||||||
|
# Get latest value for each field individually
|
||||||
|
for field_name, label in fields:
|
||||||
|
cur.execute(
|
||||||
|
f"""SELECT {field_name}, date,
|
||||||
|
CURRENT_DATE - date AS age_days
|
||||||
|
FROM circumference_log
|
||||||
|
WHERE profile_id=%s AND {field_name} IS NOT NULL
|
||||||
|
ORDER BY date DESC LIMIT 1""",
|
||||||
|
(profile_id,)
|
||||||
|
)
|
||||||
|
row = r2d(cur.fetchone()) if cur.rowcount > 0 else None
|
||||||
|
|
||||||
|
if row:
|
||||||
|
value = row[field_name]
|
||||||
|
age_days = row['age_days']
|
||||||
|
|
||||||
|
# Format age annotation
|
||||||
|
if age_days == 0:
|
||||||
|
age_str = "heute"
|
||||||
|
elif age_days == 1:
|
||||||
|
age_str = "gestern"
|
||||||
|
elif age_days <= 7:
|
||||||
|
age_str = f"vor {age_days} Tagen"
|
||||||
|
elif age_days <= 30:
|
||||||
|
weeks = age_days // 7
|
||||||
|
age_str = f"vor {weeks} Woche{'n' if weeks > 1 else ''}"
|
||||||
|
else:
|
||||||
|
months = age_days // 30
|
||||||
|
age_str = f"vor {months} Monat{'en' if months > 1 else ''}"
|
||||||
|
|
||||||
|
parts.append(f"{label} {value:.1f}cm ({age_str})")
|
||||||
|
|
||||||
|
return ', '.join(parts) if parts else "keine Umfangsmessungen"
|
||||||
|
|
||||||
|
|
||||||
|
def get_goal_weight(profile_id: str) -> str:
|
||||||
|
"""Get goal weight from profile."""
|
||||||
|
profile = get_profile_data(profile_id)
|
||||||
|
goal = profile.get('goal_weight')
|
||||||
|
return f"{goal:.1f}" if goal else "nicht gesetzt"
|
||||||
|
|
||||||
|
|
||||||
|
def get_goal_bf_pct(profile_id: str) -> str:
|
||||||
|
"""Get goal body fat percentage from profile."""
|
||||||
|
profile = get_profile_data(profile_id)
|
||||||
|
goal = profile.get('goal_bf_pct')
|
||||||
|
return f"{goal:.1f}" if goal else "nicht gesetzt"
|
||||||
|
|
||||||
|
|
||||||
|
def get_nutrition_days(profile_id: str, days: int = 30) -> str:
|
||||||
|
"""Get number of days with nutrition data."""
|
||||||
|
with get_db() as conn:
|
||||||
|
cur = get_cursor(conn)
|
||||||
|
cutoff = (datetime.now() - timedelta(days=days)).strftime('%Y-%m-%d')
|
||||||
|
cur.execute(
|
||||||
|
"""SELECT COUNT(DISTINCT date) as days FROM nutrition_log
|
||||||
|
WHERE profile_id=%s AND date >= %s""",
|
||||||
|
(profile_id, cutoff)
|
||||||
|
)
|
||||||
|
row = cur.fetchone()
|
||||||
|
return str(row['days']) if row else "0"
|
||||||
|
|
||||||
|
|
||||||
|
def get_protein_ziel_low(profile_id: str) -> str:
|
||||||
|
"""Calculate lower protein target based on current weight (1.6g/kg)."""
|
||||||
|
with get_db() as conn:
|
||||||
|
cur = get_cursor(conn)
|
||||||
|
cur.execute(
|
||||||
|
"""SELECT weight FROM weight_log
|
||||||
|
WHERE profile_id=%s ORDER BY date DESC LIMIT 1""",
|
||||||
|
(profile_id,)
|
||||||
|
)
|
||||||
|
row = cur.fetchone()
|
||||||
|
if row:
|
||||||
|
return f"{int(float(row['weight']) * 1.6)}"
|
||||||
|
return "nicht verfügbar"
|
||||||
|
|
||||||
|
|
||||||
|
def get_protein_ziel_high(profile_id: str) -> str:
|
||||||
|
"""Calculate upper protein target based on current weight (2.2g/kg)."""
|
||||||
|
with get_db() as conn:
|
||||||
|
cur = get_cursor(conn)
|
||||||
|
cur.execute(
|
||||||
|
"""SELECT weight FROM weight_log
|
||||||
|
WHERE profile_id=%s ORDER BY date DESC LIMIT 1""",
|
||||||
|
(profile_id,)
|
||||||
|
)
|
||||||
|
row = cur.fetchone()
|
||||||
|
if row:
|
||||||
|
return f"{int(float(row['weight']) * 2.2)}"
|
||||||
|
return "nicht verfügbar"
|
||||||
|
|
||||||
|
|
||||||
|
def get_activity_summary(profile_id: str, days: int = 14) -> str:
    """Get activity summary for recent period."""
    with get_db() as conn:
        cur = get_cursor(conn)
        cutoff = (datetime.now() - timedelta(days=days)).strftime('%Y-%m-%d')
        cur.execute(
            """SELECT COUNT(*) as count,
                      SUM(duration_min) as total_min,
                      SUM(kcal_active) as total_kcal
               FROM activity_log
               WHERE profile_id=%s AND date >= %s""",
            (profile_id, cutoff)
        )
        row = r2d(cur.fetchone())

        if row['count'] == 0:
            return f"Keine Aktivitäten in den letzten {days} Tagen"

        avg_min = int(row['total_min'] / row['count']) if row['total_min'] else 0
        return f"{row['count']} Einheiten in {days} Tagen (Ø {avg_min} min/Einheit, {int(row['total_kcal'] or 0)} kcal gesamt)"

def calculate_age(dob) -> str:
    """Calculate age from date of birth (accepts date object or string)."""
    if not dob:
        return "unbekannt"
    try:
        # Handle both datetime.date objects and strings
        if isinstance(dob, str):
            birth = datetime.strptime(dob, '%Y-%m-%d').date()
        else:
            birth = dob  # Already a date object from PostgreSQL

        today = datetime.now().date()
        age = today.year - birth.year - ((today.month, today.day) < (birth.month, birth.day))
        return str(age)
    except Exception:
        return "unbekannt"

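The tuple-comparison trick in `calculate_age` (subtract one year while this year's birthday is still ahead) can be checked in isolation; a minimal sketch with fixed, illustrative dates:

```python
from datetime import date

def age_on(birth: date, today: date) -> int:
    # (month, day) tuples compare lexicographically, so this is True
    # exactly when the birthday has not yet occurred this year
    return today.year - birth.year - ((today.month, today.day) < (birth.month, birth.day))

print(age_on(date(1990, 6, 15), date(2026, 6, 14)))  # day before birthday -> 35
print(age_on(date(1990, 6, 15), date(2026, 6, 15)))  # on the birthday -> 36
```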
def get_activity_detail(profile_id: str, days: int = 14) -> str:
    """Get detailed activity log for analysis."""
    with get_db() as conn:
        cur = get_cursor(conn)
        cutoff = (datetime.now() - timedelta(days=days)).strftime('%Y-%m-%d')
        cur.execute(
            """SELECT date, activity_type, duration_min, kcal_active, hr_avg
               FROM activity_log
               WHERE profile_id=%s AND date >= %s
               ORDER BY date DESC
               LIMIT 50""",
            (profile_id, cutoff)
        )
        rows = [r2d(r) for r in cur.fetchall()]

        if not rows:
            return f"Keine Aktivitäten in den letzten {days} Tagen"

        # Format as readable list
        lines = []
        for r in rows:
            hr_str = f" HF={r['hr_avg']}" if r.get('hr_avg') else ""
            lines.append(
                f"{r['date']}: {r['activity_type']} ({r['duration_min']}min, {r.get('kcal_active', 0)}kcal{hr_str})"
            )

        return '\n'.join(lines[:20])  # Max 20 entries to avoid token bloat

def get_trainingstyp_verteilung(profile_id: str, days: int = 14) -> str:
    """Get training type distribution."""
    with get_db() as conn:
        cur = get_cursor(conn)
        cutoff = (datetime.now() - timedelta(days=days)).strftime('%Y-%m-%d')
        cur.execute(
            """SELECT training_category, COUNT(*) as count
               FROM activity_log
               WHERE profile_id=%s AND date >= %s AND training_category IS NOT NULL
               GROUP BY training_category
               ORDER BY count DESC""",
            (profile_id, cutoff)
        )
        rows = [r2d(r) for r in cur.fetchall()]

        if not rows:
            return "Keine kategorisierten Trainings"

        total = sum(r['count'] for r in rows)
        parts = [f"{r['training_category']}: {int(r['count']/total*100)}%" for r in rows[:3]]
        return ", ".join(parts)

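The percentage split produced by `get_trainingstyp_verteilung` is easy to exercise without a database; a minimal sketch with hypothetical `GROUP BY` rows (categories and counts are made up):

```python
rows = [  # hypothetical query result, already sorted by count DESC
    {'training_category': 'Kraft', 'count': 4},
    {'training_category': 'Ausdauer', 'count': 2},
    {'training_category': 'Mobilität', 'count': 2},
]
total = sum(r['count'] for r in rows)
# Only the top 3 categories are reported, as in the real function
parts = [f"{r['training_category']}: {int(r['count'] / total * 100)}%" for r in rows[:3]]
print(", ".join(parts))  # Kraft: 50%, Ausdauer: 25%, Mobilität: 25%
```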
def get_sleep_avg_duration(profile_id: str, days: int = 7) -> str:
    """Calculate average sleep duration in hours."""
    with get_db() as conn:
        cur = get_cursor(conn)
        cutoff = (datetime.now() - timedelta(days=days)).strftime('%Y-%m-%d')
        cur.execute(
            """SELECT sleep_segments FROM sleep_log
               WHERE profile_id=%s AND date >= %s
               ORDER BY date DESC""",
            (profile_id, cutoff)
        )
        rows = cur.fetchall()

        if not rows:
            return "nicht verfügbar"

        total_minutes = 0
        for row in rows:
            segments = row['sleep_segments']
            if segments:
                # Sum duration_min from all segments
                for seg in segments:
                    total_minutes += seg.get('duration_min', 0)

        if total_minutes == 0:
            return "nicht verfügbar"

        avg_hours = total_minutes / len(rows) / 60
        return f"{avg_hours:.1f}h"


def get_sleep_avg_quality(profile_id: str, days: int = 7) -> str:
    """Calculate average sleep quality (Deep+REM %)."""
    with get_db() as conn:
        cur = get_cursor(conn)
        cutoff = (datetime.now() - timedelta(days=days)).strftime('%Y-%m-%d')
        cur.execute(
            """SELECT sleep_segments FROM sleep_log
               WHERE profile_id=%s AND date >= %s
               ORDER BY date DESC""",
            (profile_id, cutoff)
        )
        rows = cur.fetchall()

        if not rows:
            return "nicht verfügbar"

        total_quality = 0
        count = 0
        for row in rows:
            segments = row['sleep_segments']
            if segments:
                # Note: segments use 'phase' key (not 'stage'), stored lowercase (deep, rem, light, awake)
                deep_rem_min = sum(s.get('duration_min', 0) for s in segments if s.get('phase') in ['deep', 'rem'])
                total_min = sum(s.get('duration_min', 0) for s in segments)
                if total_min > 0:
                    quality_pct = (deep_rem_min / total_min) * 100
                    total_quality += quality_pct
                    count += 1

        if count == 0:
            return "nicht verfügbar"

        avg_quality = total_quality / count
        return f"{avg_quality:.0f}% (Deep+REM)"

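The Deep+REM share computed per night in `get_sleep_avg_quality` can be sketched standalone; the segment list below is hypothetical sample data in the documented shape (lowercase `phase` keys):

```python
segments = [  # hypothetical sleep_segments for one night
    {'phase': 'light', 'duration_min': 240},
    {'phase': 'deep', 'duration_min': 90},
    {'phase': 'rem', 'duration_min': 60},
    {'phase': 'awake', 'duration_min': 10},
]
deep_rem_min = sum(s.get('duration_min', 0) for s in segments if s.get('phase') in ['deep', 'rem'])
total_min = sum(s.get('duration_min', 0) for s in segments)
quality_pct = (deep_rem_min / total_min) * 100
print(quality_pct)  # 150 of 400 minutes are deep/rem -> 37.5
```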
def get_rest_days_count(profile_id: str, days: int = 30) -> str:
    """Count rest days in the given period."""
    with get_db() as conn:
        cur = get_cursor(conn)
        cutoff = (datetime.now() - timedelta(days=days)).strftime('%Y-%m-%d')
        cur.execute(
            """SELECT COUNT(DISTINCT date) as count FROM rest_days
               WHERE profile_id=%s AND date >= %s""",
            (profile_id, cutoff)
        )
        row = cur.fetchone()
        count = row['count'] if row else 0
        return f"{count} Ruhetage"

def get_vitals_avg_hr(profile_id: str, days: int = 7) -> str:
    """Calculate average resting heart rate."""
    with get_db() as conn:
        cur = get_cursor(conn)
        cutoff = (datetime.now() - timedelta(days=days)).strftime('%Y-%m-%d')
        cur.execute(
            """SELECT AVG(resting_hr) as avg FROM vitals_baseline
               WHERE profile_id=%s AND date >= %s AND resting_hr IS NOT NULL""",
            (profile_id, cutoff)
        )
        row = cur.fetchone()

        if row and row['avg']:
            return f"{int(row['avg'])} bpm"
        return "nicht verfügbar"


def get_vitals_avg_hrv(profile_id: str, days: int = 7) -> str:
    """Calculate average heart rate variability."""
    with get_db() as conn:
        cur = get_cursor(conn)
        cutoff = (datetime.now() - timedelta(days=days)).strftime('%Y-%m-%d')
        cur.execute(
            """SELECT AVG(hrv) as avg FROM vitals_baseline
               WHERE profile_id=%s AND date >= %s AND hrv IS NOT NULL""",
            (profile_id, cutoff)
        )
        row = cur.fetchone()

        if row and row['avg']:
            return f"{int(row['avg'])} ms"
        return "nicht verfügbar"


def get_vitals_vo2_max(profile_id: str) -> str:
    """Get latest VO2 Max value."""
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute(
            """SELECT vo2_max FROM vitals_baseline
               WHERE profile_id=%s AND vo2_max IS NOT NULL
               ORDER BY date DESC LIMIT 1""",
            (profile_id,)
        )
        row = cur.fetchone()

        if row and row['vo2_max']:
            return f"{row['vo2_max']:.1f} ml/kg/min"
        return "nicht verfügbar"

# ── Placeholder Registry ──────────────────────────────────────────────────────

PLACEHOLDER_MAP: Dict[str, Callable[[str], str]] = {
    # Profil
    '{{name}}': lambda pid: get_profile_data(pid).get('name', 'Nutzer'),
    '{{age}}': lambda pid: calculate_age(get_profile_data(pid).get('dob')),
    '{{height}}': lambda pid: str(get_profile_data(pid).get('height', 'unbekannt')),
    '{{geschlecht}}': lambda pid: 'männlich' if get_profile_data(pid).get('sex') == 'm' else 'weiblich',

    # Körper
    '{{weight_aktuell}}': get_latest_weight,
    '{{weight_trend}}': get_weight_trend,
    '{{kf_aktuell}}': get_latest_bf,
    '{{bmi}}': lambda pid: calculate_bmi(pid),  # lambda defers lookup; calculate_bmi is defined below
    '{{caliper_summary}}': get_caliper_summary,
    '{{circ_summary}}': get_circ_summary,
    '{{goal_weight}}': get_goal_weight,
    '{{goal_bf_pct}}': get_goal_bf_pct,

    # Ernährung
    '{{kcal_avg}}': lambda pid: get_nutrition_avg(pid, 'kcal', 30),
    '{{protein_avg}}': lambda pid: get_nutrition_avg(pid, 'protein', 30),
    '{{carb_avg}}': lambda pid: get_nutrition_avg(pid, 'carb', 30),
    '{{fat_avg}}': lambda pid: get_nutrition_avg(pid, 'fat', 30),
    '{{nutrition_days}}': lambda pid: get_nutrition_days(pid, 30),
    '{{protein_ziel_low}}': get_protein_ziel_low,
    '{{protein_ziel_high}}': get_protein_ziel_high,

    # Training
    '{{activity_summary}}': get_activity_summary,
    '{{activity_detail}}': get_activity_detail,
    '{{trainingstyp_verteilung}}': get_trainingstyp_verteilung,

    # Schlaf & Erholung
    '{{sleep_avg_duration}}': lambda pid: get_sleep_avg_duration(pid, 7),
    '{{sleep_avg_quality}}': lambda pid: get_sleep_avg_quality(pid, 7),
    '{{rest_days_count}}': lambda pid: get_rest_days_count(pid, 30),

    # Vitalwerte
    '{{vitals_avg_hr}}': lambda pid: get_vitals_avg_hr(pid, 7),
    '{{vitals_avg_hrv}}': lambda pid: get_vitals_avg_hrv(pid, 7),
    '{{vitals_vo2_max}}': get_vitals_vo2_max,

    # Zeitraum
    '{{datum_heute}}': lambda pid: datetime.now().strftime('%d.%m.%Y'),
    '{{zeitraum_7d}}': lambda pid: 'letzte 7 Tage',
    '{{zeitraum_30d}}': lambda pid: 'letzte 30 Tage',
    '{{zeitraum_90d}}': lambda pid: 'letzte 90 Tage',
}

def calculate_bmi(profile_id: str) -> str:
    """Calculate BMI from latest weight and profile height."""
    profile = get_profile_data(profile_id)
    if not profile.get('height'):
        return "nicht verfügbar"

    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute(
            "SELECT weight FROM weight_log WHERE profile_id=%s ORDER BY date DESC LIMIT 1",
            (profile_id,)
        )
        row = cur.fetchone()
        if not row:
            return "nicht verfügbar"

        height_m = profile['height'] / 100
        # float() matches the other weight readers (weight may come back as Decimal)
        bmi = float(row['weight']) / (height_m ** 2)
        return f"{bmi:.1f}"

# ── Public API ────────────────────────────────────────────────────────────────

def resolve_placeholders(template: str, profile_id: str) -> str:
    """
    Replace all {{placeholders}} in template with actual user data.

    Args:
        template: Prompt template with placeholders
        profile_id: User profile ID

    Returns:
        Resolved template with placeholders replaced by values
    """
    result = template

    for placeholder, resolver in PLACEHOLDER_MAP.items():
        if placeholder in result:
            try:
                value = resolver(profile_id)
                result = result.replace(placeholder, str(value))
            except Exception:
                # On error, replace with error message
                result = result.replace(placeholder, f"[Fehler: {placeholder}]")

    return result

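The resolve loop can be exercised without a database by swapping in trivial resolvers; a minimal sketch (the two-entry map and its values are hypothetical stand-ins for the real `PLACEHOLDER_MAP`):

```python
PLACEHOLDER_MAP = {  # hypothetical stand-ins for the real DB-backed resolvers
    '{{name}}': lambda pid: 'Alex',
    '{{weight_aktuell}}': lambda pid: '81.4 kg',
}

def resolve(template: str, profile_id: str) -> str:
    result = template
    for placeholder, resolver in PLACEHOLDER_MAP.items():
        if placeholder in result:
            try:
                result = result.replace(placeholder, str(resolver(profile_id)))
            except Exception:
                # Failed resolvers degrade to a visible error marker, as in the real function
                result = result.replace(placeholder, f"[Fehler: {placeholder}]")
    return result

print(resolve("Hallo {{name}}, aktuelles Gewicht: {{weight_aktuell}}", "p1"))
# Hallo Alex, aktuelles Gewicht: 81.4 kg
```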
def get_unknown_placeholders(template: str) -> List[str]:
    """
    Find all placeholders in template that are not in PLACEHOLDER_MAP.

    Args:
        template: Prompt template

    Returns:
        List of unknown placeholder names (without {{}})
    """
    # Find all {{...}} patterns
    found = re.findall(r'\{\{(\w+)\}\}', template)

    # Filter to only unknown ones
    known_names = {p.strip('{}') for p in PLACEHOLDER_MAP.keys()}
    unknown = [p for p in found if p not in known_names]

    return list(set(unknown))  # Remove duplicates

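The `\{\{(\w+)\}\}` pattern captures only the bare name inside the braces, which is why the known set strips the braces before comparing; a minimal sketch with a made-up template and a hypothetical known set:

```python
import re

template = "Hi {{name}}, dein BMI: {{bmi}}, Mond: {{mondphase}}"
found = re.findall(r'\{\{(\w+)\}\}', template)  # ['name', 'bmi', 'mondphase']
known = {'name', 'bmi'}  # hypothetical subset of the real registry
unknown = sorted(set(p for p in found if p not in known))
print(unknown)  # ['mondphase']
```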
def get_available_placeholders(categories: Optional[List[str]] = None) -> Dict[str, List[str]]:
    """
    Get available placeholders, optionally filtered by categories.

    Args:
        categories: Optional list of categories to filter (körper, ernährung, training, etc.)

    Returns:
        Dict mapping category to list of placeholders
    """
    placeholder_categories = {
        'profil': [
            '{{name}}', '{{age}}', '{{height}}', '{{geschlecht}}'
        ],
        'körper': [
            '{{weight_aktuell}}', '{{weight_trend}}', '{{kf_aktuell}}', '{{bmi}}'
        ],
        'ernährung': [
            '{{kcal_avg}}', '{{protein_avg}}', '{{carb_avg}}', '{{fat_avg}}'
        ],
        'training': [
            '{{activity_summary}}', '{{trainingstyp_verteilung}}'
        ],
        'zeitraum': [
            '{{datum_heute}}', '{{zeitraum_7d}}', '{{zeitraum_30d}}', '{{zeitraum_90d}}'
        ]
    }

    if not categories:
        return placeholder_categories

    # Filter to requested categories
    return {k: v for k, v in placeholder_categories.items() if k in categories}

def get_placeholder_example_values(profile_id: str) -> Dict[str, str]:
    """
    Get example values for all placeholders using real user data.

    Args:
        profile_id: User profile ID

    Returns:
        Dict mapping placeholder to example value
    """
    examples = {}

    for placeholder, resolver in PLACEHOLDER_MAP.items():
        try:
            examples[placeholder] = resolver(profile_id)
        except Exception as e:
            examples[placeholder] = f"[Fehler: {str(e)}]"

    return examples

def get_placeholder_catalog(profile_id: str) -> Dict[str, List[Dict[str, str]]]:
    """
    Get grouped placeholder catalog with descriptions and example values.

    Args:
        profile_id: User profile ID

    Returns:
        Dict mapping category to list of {key, description, example}
    """
    # Placeholder definitions with descriptions
    placeholders = {
        'Profil': [
            ('name', 'Name des Nutzers'),
            ('age', 'Alter in Jahren'),
            ('height', 'Körpergröße in cm'),
            ('geschlecht', 'Geschlecht'),
        ],
        'Körper': [
            ('weight_aktuell', 'Aktuelles Gewicht in kg'),
            ('weight_trend', 'Gewichtstrend (7d/30d)'),
            ('kf_aktuell', 'Aktueller Körperfettanteil in %'),
            ('bmi', 'Body Mass Index'),
        ],
        'Ernährung': [
            ('kcal_avg', 'Durchschn. Kalorien (30d)'),
            ('protein_avg', 'Durchschn. Protein in g (30d)'),
            ('carb_avg', 'Durchschn. Kohlenhydrate in g (30d)'),
            ('fat_avg', 'Durchschn. Fett in g (30d)'),
        ],
        'Training': [
            ('activity_summary', 'Aktivitäts-Zusammenfassung (14d)'),
            ('trainingstyp_verteilung', 'Verteilung nach Trainingstypen'),
        ],
        'Schlaf & Erholung': [
            ('sleep_avg_duration', 'Durchschn. Schlafdauer (7d)'),
            ('sleep_avg_quality', 'Durchschn. Schlafqualität (7d)'),
            ('rest_days_count', 'Anzahl Ruhetage (30d)'),
        ],
        'Vitalwerte': [
            ('vitals_avg_hr', 'Durchschn. Ruhepuls (7d)'),
            ('vitals_avg_hrv', 'Durchschn. HRV (7d)'),
            ('vitals_vo2_max', 'Aktueller VO2 Max'),
        ],
        'Zeitraum': [
            ('datum_heute', 'Heutiges Datum'),
            ('zeitraum_7d', '7-Tage-Zeitraum'),
            ('zeitraum_30d', '30-Tage-Zeitraum'),
        ],
    }

    catalog = {}

    for category, items in placeholders.items():
        catalog[category] = []
        for key, description in items:
            placeholder = f'{{{{{key}}}}}'
            # Get example value if resolver exists
            resolver = PLACEHOLDER_MAP.get(placeholder)
            if resolver:
                try:
                    example = resolver(profile_id)
                except Exception:
                    example = '[Nicht verfügbar]'
            else:
                example = '[Nicht implementiert]'

            catalog[category].append({
                'key': key,
                'description': description,
                'example': str(example)
            })

    return catalog
backend/profile_evaluator.py (new file, 349 lines)
@@ -0,0 +1,349 @@
"""
Training Type Profiles - Master Evaluator
Comprehensive activity evaluation across all 7 dimensions.

Issue: #15
Date: 2026-03-23
"""
from typing import Dict, Optional, List
from datetime import datetime
import logging

from rule_engine import RuleEvaluator, IntensityZoneEvaluator, TrainingEffectsEvaluator

logger = logging.getLogger(__name__)


class TrainingProfileEvaluator:
    """
    Master class for comprehensive activity evaluation.

    Evaluates an activity against a training type profile across 7 dimensions:
    1. Minimum Requirements (Quality Gates)
    2. Intensity Zones (HR zones)
    3. Training Effects (Abilities)
    4. Periodization (Frequency & Recovery)
    5. Performance Indicators (KPIs)
    6. Safety (Warnings)
    7. AI Context
    """

    def __init__(self, parameters_registry: Dict[str, Dict]):
        """
        Initialize evaluator with parameter registry.

        Args:
            parameters_registry: Dict mapping parameter_key -> config
        """
        self.parameters_registry = parameters_registry
        self.rule_evaluator = RuleEvaluator()
        self.zone_evaluator = IntensityZoneEvaluator()
        self.effects_evaluator = TrainingEffectsEvaluator()

    def evaluate_activity(
        self,
        activity: Dict,
        training_type_profile: Optional[Dict],
        context: Optional[Dict] = None
    ) -> Dict:
        """
        Complete evaluation of an activity against its training type profile.

        Args:
            activity: Activity data dictionary
            training_type_profile: Training type profile (JSONB)
            context: {
                "user_profile": {...},
                "recent_activities": [...],
                "historical_activities": [...]
            }

        Returns:
            {
                "evaluated_at": ISO timestamp,
                "profile_version": str,
                "rule_set_results": {
                    "minimum_requirements": {...},
                    "intensity_zones": {...},
                    "training_effects": {...},
                    "periodization": {...},
                    "performance_indicators": {...},
                    "safety": {...}
                },
                "overall_score": float (0-1),
                "quality_label": str,
                "recommendations": [str],
                "warnings": [str]
            }
        """
        # No profile? Return unvalidated result
        if not training_type_profile:
            return self._create_unvalidated_result()

        rule_sets = training_type_profile.get("rule_sets", {})
        context = context or {}

        results = {
            "evaluated_at": datetime.now().isoformat(),
            "profile_version": training_type_profile.get("version", "unknown"),
            "rule_set_results": {}
        }

        # ━━━ 1. MINIMUM REQUIREMENTS ━━━
        if "minimum_requirements" in rule_sets:
            results["rule_set_results"]["minimum_requirements"] = \
                self.rule_evaluator.evaluate_rule_set(
                    rule_sets["minimum_requirements"],
                    activity,
                    self.parameters_registry
                )

        # ━━━ 2. INTENSITY ZONES ━━━
        if "intensity_zones" in rule_sets:
            results["rule_set_results"]["intensity_zones"] = \
                self.zone_evaluator.evaluate(
                    rule_sets["intensity_zones"],
                    activity,
                    context.get("user_profile", {})
                )

        # ━━━ 3. TRAINING EFFECTS ━━━
        if "training_effects" in rule_sets:
            results["rule_set_results"]["training_effects"] = \
                self.effects_evaluator.evaluate(
                    rule_sets["training_effects"],
                    activity,
                    results["rule_set_results"].get("intensity_zones")
                )

        # ━━━ 4. PERIODIZATION ━━━
        if "periodization" in rule_sets:
            results["rule_set_results"]["periodization"] = \
                self._evaluate_periodization(
                    rule_sets["periodization"],
                    activity,
                    context.get("recent_activities", [])
                )

        # ━━━ 5. PERFORMANCE INDICATORS ━━━
        if "performance_indicators" in rule_sets:
            results["rule_set_results"]["performance_indicators"] = \
                self._evaluate_performance(
                    rule_sets["performance_indicators"],
                    activity,
                    context.get("historical_activities", [])
                )

        # ━━━ 6. SAFETY WARNINGS ━━━
        if "safety" in rule_sets:
            results["rule_set_results"]["safety"] = \
                self._evaluate_safety(
                    rule_sets["safety"],
                    activity
                )

        # ━━━ OVERALL SCORE & QUALITY LABEL ━━━
        overall_score = self._calculate_overall_score(results["rule_set_results"])
        results["overall_score"] = overall_score
        results["quality_label"] = self._get_quality_label(overall_score)

        # ━━━ RECOMMENDATIONS & WARNINGS ━━━
        results["recommendations"] = self._generate_recommendations(results)
        results["warnings"] = self._collect_warnings(results)

        return results

    def _create_unvalidated_result(self) -> Dict:
        """Creates result for activities without profile."""
        return {
            "evaluated_at": datetime.now().isoformat(),
            "profile_version": None,
            "rule_set_results": {},
            "overall_score": None,
            "quality_label": None,
            "recommendations": ["Kein Trainingsprofil konfiguriert"],
            "warnings": []
        }

    def _evaluate_periodization(
        self,
        config: Dict,
        activity: Dict,
        recent_activities: List[Dict]
    ) -> Dict:
        """
        Evaluates periodization compliance (frequency & recovery).

        Simplified for MVP - full implementation later.
        """
        if not config.get("enabled", False):
            return {"enabled": False}

        # Basic frequency check
        training_type_id = activity.get("training_type_id")
        same_type_this_week = sum(
            1 for a in recent_activities
            if a.get("training_type_id") == training_type_id
        )

        frequency_config = config.get("frequency", {})
        optimal = frequency_config.get("per_week_optimal", 3)

        return {
            "enabled": True,
            "weekly_count": same_type_this_week,
            "optimal_count": optimal,
            "frequency_status": "optimal" if same_type_this_week <= optimal else "over_optimal",
            "recovery_adequate": True,  # Simplified for MVP
            "warning": None
        }

    def _evaluate_performance(
        self,
        config: Dict,
        activity: Dict,
        historical_activities: List[Dict]
    ) -> Dict:
        """
        Evaluates performance development.

        Simplified for MVP - full implementation later.
        """
        if not config.get("enabled", False):
            return {"enabled": False}

        return {
            "enabled": True,
            "trend": "stable",  # Simplified
            "metrics_comparison": {},
            "benchmark_level": "intermediate"
        }

    def _evaluate_safety(self, config: Dict, activity: Dict) -> Dict:
        """Evaluates safety warnings."""
        if not config.get("enabled", False):
            return {"enabled": False, "warnings": []}

        warnings_config = config.get("warnings", [])
        triggered_warnings = []

        for warning_rule in warnings_config:
            param_key = warning_rule.get("parameter")
            operator = warning_rule.get("operator")
            threshold = warning_rule.get("value")
            severity = warning_rule.get("severity", "medium")
            message = warning_rule.get("message", "")

            actual_value = activity.get(param_key)

            if actual_value is not None:
                operator_func = RuleEvaluator.OPERATORS.get(operator)
                if operator_func and operator_func(actual_value, threshold):
                    triggered_warnings.append({
                        "severity": severity,
                        "message": message,
                        "parameter": param_key,
                        "actual_value": actual_value,
                        "threshold": threshold
                    })

        return {
            "enabled": True,
            "warnings": triggered_warnings
        }

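The warning-triggering loop only needs an operator table mapping names like `gte` to comparison functions. A minimal standalone sketch; the `OPERATORS` dict below is an assumption standing in for `RuleEvaluator.OPERATORS` (not shown in this diff), and the rule and activity values are illustrative:

```python
import operator

# Hypothetical operator table; the real one lives on RuleEvaluator.OPERATORS
OPERATORS = {'gte': operator.ge, 'lte': operator.le, 'gt': operator.gt, 'lt': operator.lt}

warnings_config = [  # one illustrative safety rule
    {'parameter': 'avg_hr', 'operator': 'gte', 'value': 185,
     'severity': 'high', 'message': 'Puls sehr hoch'},
]
activity = {'avg_hr': 190}

triggered = []
for rule in warnings_config:
    actual = activity.get(rule['parameter'])
    op = OPERATORS.get(rule['operator'])
    if actual is not None and op and op(actual, rule['value']):
        triggered.append({'severity': rule['severity'], 'message': rule['message']})

print(triggered)  # the high-severity warning fires, since 190 >= 185
```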
    def _calculate_overall_score(self, rule_set_results: Dict) -> float:
        """
        Calculates weighted overall score.

        Weights:
        - Minimum Requirements: 40%
        - Intensity Zones: 20%
        - Periodization: 20%
        - Performance: 10%
        - Training Effects: 10%
        """
        weights = {
            "minimum_requirements": 0.4,
            "intensity_zones": 0.2,
            "periodization": 0.2,
            "performance_indicators": 0.1,
            "training_effects": 0.1
        }

        total_score = 0.0
        total_weight = 0.0

        for rule_set_name, weight in weights.items():
            result = rule_set_results.get(rule_set_name)
            if result and result.get("enabled"):
                score = result.get("score", 0.5)

                # Special handling for different result types
                if rule_set_name == "intensity_zones":
                    score = result.get("duration_quality", 0.5)
                elif rule_set_name == "periodization":
                    score = 1.0 if result.get("recovery_adequate", False) else 0.5

                total_score += score * weight
                total_weight += weight

        return round(total_score / total_weight, 2) if total_weight > 0 else 0.5

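Because disabled rule sets drop out of both numerator and denominator, the score renormalizes over whatever is enabled. A minimal sketch of that weighting with two hypothetical enabled rule sets:

```python
weights = {
    "minimum_requirements": 0.4,
    "intensity_zones": 0.2,
    "periodization": 0.2,
    "performance_indicators": 0.1,
    "training_effects": 0.1,
}
# Hypothetical per-rule-set results; only enabled sets contribute
rule_set_results = {
    "minimum_requirements": {"enabled": True, "score": 1.0},
    "intensity_zones": {"enabled": True, "duration_quality": 0.5},
}

total_score = total_weight = 0.0
for name, weight in weights.items():
    result = rule_set_results.get(name)
    if result and result.get("enabled"):
        score = result.get("score", 0.5)
        if name == "intensity_zones":
            # Zone results carry their quality under a different key
            score = result.get("duration_quality", 0.5)
        total_score += score * weight
        total_weight += weight

overall = round(total_score / total_weight, 2)
print(overall)  # (1.0*0.4 + 0.5*0.2) / 0.6 -> 0.83
```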
    def _get_quality_label(self, score: Optional[float]) -> Optional[str]:
        """Converts score to quality label."""
        if score is None:
            return None

        if score >= 0.9:
            return "excellent"
        elif score >= 0.7:
            return "good"
        elif score >= 0.5:
            return "acceptable"
        else:
            return "poor"

    def _generate_recommendations(self, results: Dict) -> List[str]:
        """Generates actionable recommendations."""
        recommendations = []

        # Check minimum requirements
        min_req = results["rule_set_results"].get("minimum_requirements", {})
        if min_req.get("enabled") and not min_req.get("passed"):
            for failed in min_req.get("failed_rules", []):
                param = failed.get("parameter")
                actual = failed.get("actual_value")
                expected = failed.get("expected_value")
                reason = failed.get("reason", "")
                symbol = failed.get("operator_symbol", "")

                recommendations.append(
                    f"{param}: {actual} {symbol} {expected} - {reason}"
                )

        # Check intensity zones
        zone_result = results["rule_set_results"].get("intensity_zones", {})
        if zone_result.get("enabled") and zone_result.get("recommendation"):
            recommendations.append(zone_result["recommendation"])

        # Default recommendation if excellent
        if results.get("quality_label") == "excellent" and not recommendations:
            recommendations.append("Hervorragendes Training! Weiter so.")

        return recommendations

    def _collect_warnings(self, results: Dict) -> List[str]:
        """Collects all warnings from safety checks."""
        safety_result = results["rule_set_results"].get("safety", {})
        if not safety_result.get("enabled"):
            return []

        warnings = []
        for warning in safety_result.get("warnings", []):
            severity_icon = "🔴" if warning["severity"] == "high" else "⚠️"
            warnings.append(f"{severity_icon} {warning['message']}")

        return warnings
backend/profile_templates.py (new file, 450 lines)
@@ -0,0 +1,450 @@
"""
Training Type Profile Templates
Pre-configured profiles for common training types.

Issue: #15
Date: 2026-03-23
"""

# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
# TEMPLATE: LAUFEN (Running) - Ausdauer-fokussiert
# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

TEMPLATE_RUNNING = {
    "version": "1.0",
    "name": "Laufen (Standard)",
    "description": "Ausdauerlauf mit Herzfrequenz-Zonen",

    "rule_sets": {
        "minimum_requirements": {
            "enabled": True,
            "pass_strategy": "weighted_score",
            "pass_threshold": 0.6,
            "rules": [
                {
                    "parameter": "duration_min",
                    "operator": "gte",
                    "value": 15,
                    "weight": 5,
                    "optional": False,
                    "reason": "Mindestens 15 Minuten für Trainingseffekt"
                },
                {
                    "parameter": "avg_hr",
                    "operator": "gte",
                    "value": 100,
                    "weight": 3,
                    "optional": False,
                    "reason": "Puls muss für Ausdauerreiz erhöht sein"
                },
                {
                    "parameter": "distance_km",
                    "operator": "gte",
                    "value": 1.0,
                    "weight": 2,
                    "optional": False,
                    "reason": "Mindestens 1 km Distanz"
                }
            ]
        },

        "intensity_zones": {
            "enabled": True,
            "zones": [
                {
                    "id": "regeneration",
                    "name": "Regeneration",
                    "color": "#4CAF50",
                    "effect": "Aktive Erholung",
                    "target_duration_min": 30,
                    "rules": [
                        {
                            "parameter": "avg_hr_percent",
                            "operator": "between",
                            "value": [50, 60]
                        }
                    ]
                },
                {
                    "id": "grundlagenausdauer",
                    "name": "Grundlagenausdauer",
                    "color": "#2196F3",
                    "effect": "Fettverbrennung, aerobe Basis",
                    "target_duration_min": 45,
                    "rules": [
                        {
                            "parameter": "avg_hr_percent",
                            "operator": "between",
                            "value": [60, 70]
                        }
                    ]
                },
                {
                    "id": "entwicklungsbereich",
                    "name": "Entwicklungsbereich",
                    "color": "#FF9800",
                    "effect": "VO2max-Training, Laktattoleranz",
                    "target_duration_min": 30,
                    "rules": [
                        {
                            "parameter": "avg_hr_percent",
                            "operator": "between",
                            "value": [70, 80]
                        }
                    ]
                },
                {
                    "id": "schwellentraining",
                    "name": "Schwellentraining",
                    "color": "#F44336",
                    "effect": "Anaerobe Schwelle, Wettkampftempo",
                    "target_duration_min": 20,
                    "rules": [
                        {
                            "parameter": "avg_hr_percent",
                            "operator": "between",
                            "value": [80, 90]
                        }
                    ]
                }
            ]
        },

        "training_effects": {
            "enabled": True,
            "default_effects": {
                "primary_abilities": [
                    {
                        "category": "konditionell",
                        "ability": "ausdauer",
                        "intensity": 5
                    }
                ],
                "secondary_abilities": [
                    {
                        "category": "konditionell",
                        "ability": "schnelligkeit",
|
||||||
|
"intensity": 2
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"category": "koordinativ",
|
||||||
|
"ability": "rhythmus",
|
||||||
|
"intensity": 3
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"category": "psychisch",
|
||||||
|
"ability": "willenskraft",
|
||||||
|
"intensity": 4
|
||||||
|
}
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"metabolic_focus": ["aerobic", "fat_oxidation"],
|
||||||
|
"muscle_groups": ["legs", "core", "cardiovascular"]
|
||||||
|
},
|
||||||
|
|
||||||
|
"periodization": {
|
||||||
|
"enabled": True,
|
||||||
|
"frequency": {
|
||||||
|
"per_week_optimal": 3,
|
||||||
|
"per_week_max": 5
|
||||||
|
},
|
||||||
|
"recovery": {
|
||||||
|
"min_hours_between": 24
|
||||||
|
}
|
||||||
|
},
|
||||||
|
|
||||||
|
"performance_indicators": {
|
||||||
|
"enabled": False
|
||||||
|
},
|
||||||
|
|
||||||
|
"safety": {
|
||||||
|
"enabled": True,
|
||||||
|
"warnings": [
|
||||||
|
{
|
||||||
|
"parameter": "avg_hr_percent",
|
||||||
|
"operator": "gt",
|
||||||
|
"value": 95,
|
||||||
|
"severity": "high",
|
||||||
|
"message": "Herzfrequenz zu hoch - Überbelastungsrisiko"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"parameter": "duration_min",
|
||||||
|
"operator": "gt",
|
||||||
|
"value": 180,
|
||||||
|
"severity": "medium",
|
||||||
|
"message": "Sehr lange Einheit - achte auf Regeneration"
|
||||||
|
}
|
||||||
|
]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
# TEMPLATE: MEDITATION - mentally focused (note ≤ instead of ≥ for HR!)
# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

TEMPLATE_MEDITATION = {
    "version": "1.0",
    "name": "Meditation (Standard)",
    "description": "Mentales Training mit niedrigem Puls",

    "rule_sets": {
        "minimum_requirements": {
            "enabled": True,
            "pass_strategy": "weighted_score",
            "pass_threshold": 0.6,
            "rules": [
                {
                    "parameter": "duration_min",
                    "operator": "gte",
                    "value": 5,
                    "weight": 5,
                    "optional": False,
                    "reason": "Mindestens 5 Minuten für Entspannungseffekt"
                },
                {
                    "parameter": "avg_hr",
                    "operator": "lte",
                    "value": 80,
                    "weight": 4,
                    "optional": False,
                    "reason": "Niedriger Puls zeigt Entspannung an"
                }
            ]
        },

        "intensity_zones": {
            "enabled": True,
            "zones": [
                {
                    "id": "deep_relaxation",
                    "name": "Tiefenentspannung",
                    "color": "#4CAF50",
                    "effect": "Parasympathikus-Aktivierung",
                    "target_duration_min": 10,
                    "rules": [
                        {
                            "parameter": "avg_hr_percent",
                            "operator": "between",
                            "value": [35, 45]
                        }
                    ]
                },
                {
                    "id": "light_meditation",
                    "name": "Leichte Meditation",
                    "color": "#2196F3",
                    "effect": "Achtsamkeit, Fokus",
                    "target_duration_min": 15,
                    "rules": [
                        {
                            "parameter": "avg_hr_percent",
                            "operator": "between",
                            "value": [45, 55]
                        }
                    ]
                }
            ]
        },

        "training_effects": {
            "enabled": True,
            "default_effects": {
                "primary_abilities": [
                    {
                        "category": "kognitiv",
                        "ability": "konzentration",
                        "intensity": 5
                    },
                    {
                        "category": "psychisch",
                        "ability": "stressresistenz",
                        "intensity": 5
                    }
                ],
                "secondary_abilities": [
                    {
                        "category": "kognitiv",
                        "ability": "wahrnehmung",
                        "intensity": 4
                    },
                    {
                        "category": "psychisch",
                        "ability": "selbstvertrauen",
                        "intensity": 3
                    }
                ]
            },
            "metabolic_focus": ["parasympathetic_activation"],
            "muscle_groups": []
        },

        "periodization": {
            "enabled": True,
            "frequency": {
                "per_week_optimal": 5,
                "per_week_max": 7
            },
            "recovery": {
                "min_hours_between": 0
            }
        },

        "performance_indicators": {
            "enabled": False
        },

        "safety": {
            "enabled": True,
            "warnings": [
                {
                    "parameter": "avg_hr",
                    "operator": "gt",
                    "value": 100,
                    "severity": "medium",
                    "message": "Herzfrequenz zu hoch für Meditation - bist du wirklich entspannt?"
                }
            ]
        }
    }
}

# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
# TEMPLATE: KRAFTTRAINING - strength-focused
# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

TEMPLATE_STRENGTH = {
    "version": "1.0",
    "name": "Krafttraining (Standard)",
    "description": "Krafttraining mit moderater Herzfrequenz",

    "rule_sets": {
        "minimum_requirements": {
            "enabled": True,
            "pass_strategy": "weighted_score",
            "pass_threshold": 0.5,
            "rules": [
                {
                    "parameter": "duration_min",
                    "operator": "gte",
                    "value": 20,
                    "weight": 5,
                    "optional": False,
                    "reason": "Mindestens 20 Minuten für Muskelreiz"
                },
                {
                    "parameter": "kcal_active",
                    "operator": "gte",
                    "value": 100,
                    "weight": 2,
                    "optional": True,
                    "reason": "Mindest-Kalorienverbrauch"
                }
            ]
        },

        "intensity_zones": {
            "enabled": False
        },

        "training_effects": {
            "enabled": True,
            "default_effects": {
                "primary_abilities": [
                    {
                        "category": "konditionell",
                        "ability": "kraft",
                        "intensity": 5
                    }
                ],
                "secondary_abilities": [
                    {
                        "category": "koordinativ",
                        "ability": "differenzierung",
                        "intensity": 3
                    },
                    {
                        "category": "psychisch",
                        "ability": "willenskraft",
                        "intensity": 4
                    }
                ]
            },
            "metabolic_focus": ["anaerobic", "muscle_growth"],
            "muscle_groups": ["full_body"]
        },

        "periodization": {
            "enabled": True,
            "frequency": {
                "per_week_optimal": 3,
                "per_week_max": 5
            },
            "recovery": {
                "min_hours_between": 48
            }
        },

        "performance_indicators": {
            "enabled": False
        },

        "safety": {
            "enabled": True,
            "warnings": []
        }
    }
}

# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
# TEMPLATE REGISTRY
# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

TEMPLATES = {
    "running": {
        "name_de": "Laufen",
        "name_en": "Running",
        "icon": "🏃",
        "categories": ["cardio", "running"],
        "template": TEMPLATE_RUNNING
    },
    "meditation": {
        "name_de": "Meditation",
        "name_en": "Meditation",
        "icon": "🧘",
        "categories": ["geist", "meditation"],
        "template": TEMPLATE_MEDITATION
    },
    "strength": {
        "name_de": "Krafttraining",
        "name_en": "Strength Training",
        "icon": "💪",
        "categories": ["kraft", "krafttraining"],
        "template": TEMPLATE_STRENGTH
    }
}


def get_template(template_key: str) -> dict | None:
    """Get profile template by key; returns None for unknown keys."""
    template_info = TEMPLATES.get(template_key)
    if not template_info:
        return None
    return template_info["template"]


def list_templates() -> list:
    """List all available templates."""
    return [
        {
            "key": key,
            "name_de": info["name_de"],
            "name_en": info["name_en"],
            "icon": info["icon"],
            "categories": info["categories"]
        }
        for key, info in TEMPLATES.items()
    ]
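The templates above only declare rules; this file does not show how a `minimum_requirements` rule set is evaluated. A minimal sketch of the weighted-score idea (operator names taken from the templates; the evaluation logic itself is an assumption, not the project's actual evaluator) could look like:

```python
# Sketch: evaluate "minimum_requirements" rules with a weighted score.
# Operators mirror the template data ("gte", "lte", "between"); the
# scoring logic is an illustrative assumption.
OPS = {
    "gte": lambda v, t: v >= t,
    "lte": lambda v, t: v <= t,
    "between": lambda v, t: t[0] <= v <= t[1],
}

def weighted_pass(rule_set: dict, activity: dict) -> bool:
    """Pass if the weight-fraction of satisfied rules meets pass_threshold."""
    total = scored = 0
    for rule in rule_set["rules"]:
        total += rule["weight"]
        value = activity.get(rule["parameter"])
        if value is not None and OPS[rule["operator"]](value, rule["value"]):
            scored += rule["weight"]
    return total > 0 and scored / total >= rule_set["pass_threshold"]

# Rules copied from TEMPLATE_RUNNING's minimum_requirements
rules = {
    "pass_threshold": 0.6,
    "rules": [
        {"parameter": "duration_min", "operator": "gte", "value": 15, "weight": 5},
        {"parameter": "avg_hr", "operator": "gte", "value": 100, "weight": 3},
        {"parameter": "distance_km", "operator": "gte", "value": 1.0, "weight": 2},
    ],
}
print(weighted_pass(rules, {"duration_min": 40, "avg_hr": 135, "distance_km": 6.2}))  # True
print(weighted_pass(rules, {"duration_min": 10, "avg_hr": 95, "distance_km": 0.5}))   # False
```

With `pass_threshold: 0.6` and weights 5/3/2, a run that satisfies only the duration rule would score 5/10 = 0.5 and fail, which matches the intent of the per-rule `weight` fields.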
526 backend/prompt_executor.py (new file)

@@ -0,0 +1,526 @@
"""
|
||||||
|
Unified Prompt Executor (Issue #28 Phase 2)
|
||||||
|
|
||||||
|
Executes both base and pipeline-type prompts with:
|
||||||
|
- Dynamic placeholder resolution
|
||||||
|
- JSON output validation
|
||||||
|
- Multi-stage parallel execution
|
||||||
|
- Reference and inline prompt support
|
||||||
|
"""
|
||||||
|
import json
|
||||||
|
import re
|
||||||
|
from typing import Dict, Any, Optional
|
||||||
|
from db import get_db, get_cursor, r2d
|
||||||
|
from fastapi import HTTPException
|
||||||
|
|
||||||
|
|
||||||
|
def resolve_placeholders(template: str, variables: Dict[str, Any], debug_info: Optional[Dict] = None, catalog: Optional[Dict] = None) -> str:
|
||||||
|
"""
|
||||||
|
Replace {{placeholder}} with values from variables dict.
|
||||||
|
|
||||||
|
Supports modifiers:
|
||||||
|
- {{key|d}} - Include description in parentheses (requires catalog)
|
||||||
|
|
||||||
|
Args:
|
||||||
|
template: String with {{key}} or {{key|modifiers}} placeholders
|
||||||
|
variables: Dict of key -> value mappings
|
||||||
|
debug_info: Optional dict to collect debug information
|
||||||
|
catalog: Optional placeholder catalog for descriptions (from get_placeholder_catalog)
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Template with placeholders replaced
|
||||||
|
"""
|
||||||
|
resolved = {}
|
||||||
|
unresolved = []
|
||||||
|
|
||||||
|
def replacer(match):
|
||||||
|
full_placeholder = match.group(1).strip()
|
||||||
|
|
||||||
|
# Parse key and modifiers (e.g., "weight_aktuell|d" -> key="weight_aktuell", modifiers="d")
|
||||||
|
parts = full_placeholder.split('|')
|
||||||
|
key = parts[0].strip()
|
||||||
|
modifiers = parts[1].strip() if len(parts) > 1 else ''
|
||||||
|
|
||||||
|
if key in variables:
|
||||||
|
value = variables[key]
|
||||||
|
# Convert dict/list to JSON string
|
||||||
|
if isinstance(value, (dict, list)):
|
||||||
|
resolved_value = json.dumps(value, ensure_ascii=False)
|
||||||
|
else:
|
||||||
|
resolved_value = str(value)
|
||||||
|
|
||||||
|
# Apply modifiers
|
||||||
|
if 'd' in modifiers:
|
||||||
|
if catalog:
|
||||||
|
# Add description from catalog
|
||||||
|
description = None
|
||||||
|
for cat_items in catalog.values():
|
||||||
|
matching = [item for item in cat_items if item['key'] == key]
|
||||||
|
if matching:
|
||||||
|
description = matching[0].get('description', '')
|
||||||
|
break
|
||||||
|
|
||||||
|
if description:
|
||||||
|
resolved_value = f"{resolved_value} ({description})"
|
||||||
|
else:
|
||||||
|
# Catalog not available - log warning in debug
|
||||||
|
if debug_info is not None:
|
||||||
|
if 'warnings' not in debug_info:
|
||||||
|
debug_info['warnings'] = []
|
||||||
|
debug_info['warnings'].append(f"Modifier |d used but catalog not available for {key}")
|
||||||
|
|
||||||
|
# Track resolution for debug
|
||||||
|
if debug_info is not None:
|
||||||
|
resolved[key] = resolved_value[:100] + ('...' if len(resolved_value) > 100 else '')
|
||||||
|
|
||||||
|
return resolved_value
|
||||||
|
else:
|
||||||
|
# Keep placeholder if no value found
|
||||||
|
if debug_info is not None:
|
||||||
|
unresolved.append(key)
|
||||||
|
return match.group(0)
|
||||||
|
|
||||||
|
result = re.sub(r'\{\{([^}]+)\}\}', replacer, template)
|
||||||
|
|
||||||
|
# Store debug info
|
||||||
|
if debug_info is not None:
|
||||||
|
debug_info['resolved_placeholders'] = resolved
|
||||||
|
debug_info['unresolved_placeholders'] = unresolved
|
||||||
|
|
||||||
|
return result
|
||||||
|
|
||||||
|
|
||||||
|
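The substitution behavior can be illustrated with a stripped-down, self-contained version of the same regex replacement (no catalog or debug plumbing):

```python
import json
import re

def render(template: str, variables: dict) -> str:
    """Minimal {{key}} substitution; unknown keys are left untouched."""
    def replacer(match):
        # Ignore any |modifier suffix in this simplified version
        key = match.group(1).strip().split('|')[0].strip()
        if key not in variables:
            return match.group(0)  # keep the placeholder verbatim
        value = variables[key]
        # dicts/lists become JSON, everything else a plain string
        return json.dumps(value, ensure_ascii=False) if isinstance(value, (dict, list)) else str(value)
    return re.sub(r'\{\{([^}]+)\}\}', replacer, template)

print(render("Hallo {{name}}, Daten: {{data}}, {{missing}}",
             {"name": "Eva", "data": [1, 2]}))
# Hallo Eva, Daten: [1, 2], {{missing}}
```

Leaving unresolved placeholders intact (rather than raising) is what lets the full executor report them afterwards via `unresolved_placeholders` in the debug info.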
def validate_json_output(output: str, schema: Optional[Dict] = None, debug_info: Optional[Dict] = None) -> Dict:
    """
    Validate that output is valid JSON.

    Unwraps Markdown-wrapped JSON (```json ... ```) if present.

    Args:
        output: String to validate
        schema: Optional JSON schema to validate against (TODO: jsonschema library)
        debug_info: Optional dict to attach to error for debugging

    Returns:
        Parsed JSON dict

    Raises:
        HTTPException: If output is not valid JSON (with debug info attached)
    """
    # Try to unwrap Markdown code blocks (common AI pattern)
    unwrapped = output.strip()
    if unwrapped.startswith('```json'):
        # Extract content between ```json and ```
        lines = unwrapped.split('\n')
        if len(lines) > 2 and lines[-1].strip() == '```':
            unwrapped = '\n'.join(lines[1:-1])
    elif unwrapped.startswith('```'):
        # Generic code block
        lines = unwrapped.split('\n')
        if len(lines) > 2 and lines[-1].strip() == '```':
            unwrapped = '\n'.join(lines[1:-1])

    try:
        parsed = json.loads(unwrapped)
        # TODO: Add jsonschema validation if schema provided
        return parsed
    except json.JSONDecodeError as e:
        error_detail = {
            "error": f"AI returned invalid JSON: {str(e)}",
            "raw_output": output[:500] + ('...' if len(output) > 500 else ''),
            "unwrapped": unwrapped[:500] if unwrapped != output else None,
            "output_length": len(output)
        }
        if debug_info:
            error_detail["debug"] = debug_info

        raise HTTPException(
            status_code=500,
            detail=error_detail
        )
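The unwrap-then-parse step can be seen in isolation in this self-contained sketch of the same logic (without the HTTPException error path):

```python
import json

def unwrap_and_parse(output: str) -> dict:
    """Strip a Markdown code fence (```json ... ``` or ``` ... ```) and parse JSON."""
    unwrapped = output.strip()
    # startswith('```') also matches '```json', so one branch suffices here
    if unwrapped.startswith('```'):
        lines = unwrapped.split('\n')
        if len(lines) > 2 and lines[-1].strip() == '```':
            unwrapped = '\n'.join(lines[1:-1])  # drop opening and closing fence
    return json.loads(unwrapped)

print(unwrap_and_parse('```json\n{"ok": true}\n```'))  # {'ok': True}
print(unwrap_and_parse('{"ok": true}'))                # {'ok': True}
```

Models frequently wrap JSON answers in a Markdown fence even when asked not to, so tolerating both fenced and bare output avoids spurious 500s.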
async def execute_prompt(
    prompt_slug: str,
    variables: Dict[str, Any],
    openrouter_call_func,
    enable_debug: bool = False
) -> Dict[str, Any]:
    """
    Execute a single prompt (base or pipeline type).

    Args:
        prompt_slug: Slug of prompt to execute
        variables: Dict of variables for placeholder replacement
        openrouter_call_func: Async function(prompt_text) -> response_text
        enable_debug: If True, include debug information in response

    Returns:
        Dict with execution results:
        {
            "type": "base" | "pipeline",
            "slug": "...",
            "output": "..." | {...},  # String or parsed JSON
            "stages": [...],          # Only for pipeline type
            "debug": {...}            # Only if enable_debug=True
        }
    """
    # Load prompt from database
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute(
            """SELECT * FROM ai_prompts
               WHERE slug = %s AND active = true""",
            (prompt_slug,)
        )
        row = cur.fetchone()
        if not row:
            raise HTTPException(404, f"Prompt not found: {prompt_slug}")

        prompt = r2d(row)

    prompt_type = prompt.get('type', 'pipeline')

    # Get catalog from variables if available (passed from execute_prompt_with_data)
    catalog = variables.pop('_catalog', None) if '_catalog' in variables else None

    if prompt_type == 'base':
        # Base prompt: single execution with template
        return await execute_base_prompt(prompt, variables, openrouter_call_func, enable_debug, catalog)

    elif prompt_type == 'pipeline':
        # Pipeline prompt: multi-stage execution
        return await execute_pipeline_prompt(prompt, variables, openrouter_call_func, enable_debug, catalog)

    else:
        raise HTTPException(400, f"Unknown prompt type: {prompt_type}")
async def execute_base_prompt(
    prompt: Dict,
    variables: Dict[str, Any],
    openrouter_call_func,
    enable_debug: bool = False,
    catalog: Optional[Dict] = None
) -> Dict[str, Any]:
    """Execute a base-type prompt (single template)."""
    template = prompt.get('template')
    if not template:
        raise HTTPException(400, f"Base prompt missing template: {prompt['slug']}")

    debug_info = {} if enable_debug else None

    # Resolve placeholders (with optional catalog for |d modifier)
    prompt_text = resolve_placeholders(template, variables, debug_info, catalog)

    if enable_debug:
        debug_info['template'] = template
        debug_info['final_prompt'] = prompt_text[:500] + ('...' if len(prompt_text) > 500 else '')
        debug_info['available_variables'] = list(variables.keys())

    # Call AI
    response = await openrouter_call_func(prompt_text)

    if enable_debug:
        debug_info['ai_response_length'] = len(response)
        debug_info['ai_response_preview'] = response[:200] + ('...' if len(response) > 200 else '')

    # Validate JSON if required
    output_format = prompt.get('output_format', 'text')
    if output_format == 'json':
        output = validate_json_output(response, prompt.get('output_schema'), debug_info if enable_debug else None)
    else:
        output = response

    result = {
        "type": "base",
        "slug": prompt['slug'],
        "output": output,
        "output_format": output_format
    }

    if enable_debug:
        result['debug'] = debug_info

    return result
async def execute_pipeline_prompt(
    prompt: Dict,
    variables: Dict[str, Any],
    openrouter_call_func,
    enable_debug: bool = False,
    catalog: Optional[Dict] = None
) -> Dict[str, Any]:
    """
    Execute a pipeline-type prompt (multi-stage).

    Each stage's results are added to variables for the next stage.
    """
    stages = prompt.get('stages')
    if not stages:
        raise HTTPException(400, f"Pipeline prompt missing stages: {prompt['slug']}")

    # Parse stages if stored as JSON string
    if isinstance(stages, str):
        stages = json.loads(stages)

    stage_results = []
    context_vars = variables.copy()
    pipeline_debug = [] if enable_debug else None

    # Execute stages in order
    for stage_def in sorted(stages, key=lambda s: s['stage']):
        stage_num = stage_def['stage']
        stage_prompts = stage_def.get('prompts', [])

        if not stage_prompts:
            continue

        stage_debug = {} if enable_debug else None
        if enable_debug:
            stage_debug['stage'] = stage_num
            stage_debug['available_variables'] = list(context_vars.keys())
            stage_debug['prompts'] = []

        # Execute all prompts in this stage (parallel concept, sequential impl for now)
        stage_outputs = {}

        for prompt_def in stage_prompts:
            source = prompt_def.get('source')
            output_key = prompt_def.get('output_key', f'stage{stage_num}')
            output_format = prompt_def.get('output_format', 'text')

            prompt_debug = {} if enable_debug else None

            if source == 'reference':
                # Reference to another prompt
                ref_slug = prompt_def.get('slug')
                if not ref_slug:
                    raise HTTPException(400, f"Reference prompt missing slug in stage {stage_num}")

                if enable_debug:
                    prompt_debug['source'] = 'reference'
                    prompt_debug['ref_slug'] = ref_slug

                # Load referenced prompt
                result = await execute_prompt(ref_slug, context_vars, openrouter_call_func, enable_debug)
                output = result['output']

                if enable_debug and 'debug' in result:
                    prompt_debug['ref_debug'] = result['debug']

            elif source == 'inline':
                # Inline template
                template = prompt_def.get('template')
                if not template:
                    raise HTTPException(400, f"Inline prompt missing template in stage {stage_num}")

                placeholder_debug = {} if enable_debug else None
                prompt_text = resolve_placeholders(template, context_vars, placeholder_debug, catalog)

                if enable_debug:
                    prompt_debug['source'] = 'inline'
                    prompt_debug['template'] = template
                    prompt_debug['final_prompt'] = prompt_text[:500] + ('...' if len(prompt_text) > 500 else '')
                    prompt_debug.update(placeholder_debug)

                response = await openrouter_call_func(prompt_text)

                if enable_debug:
                    prompt_debug['ai_response_length'] = len(response)
                    prompt_debug['ai_response_preview'] = response[:200] + ('...' if len(response) > 200 else '')

                # Validate JSON if required
                if output_format == 'json':
                    output = validate_json_output(response, prompt_def.get('output_schema'), prompt_debug if enable_debug else None)
                else:
                    output = response

            else:
                raise HTTPException(400, f"Unknown prompt source: {source}")

            # Store output with key
            stage_outputs[output_key] = output

            # Add to context for next stage
            context_var_key = f'stage_{stage_num}_{output_key}'
            context_vars[context_var_key] = output

            if enable_debug:
                prompt_debug['output_key'] = output_key
                prompt_debug['context_var_key'] = context_var_key
                stage_debug['prompts'].append(prompt_debug)

        stage_results.append({
            "stage": stage_num,
            "outputs": stage_outputs
        })

        if enable_debug:
            stage_debug['output'] = stage_outputs  # Add outputs to debug info for value table
            pipeline_debug.append(stage_debug)

    # Final output is the last stage's outputs
    final_output = stage_results[-1]['outputs'] if stage_results else {}

    result = {
        "type": "pipeline",
        "slug": prompt['slug'],
        "stages": stage_results,
        "output": final_output,
        "output_format": prompt.get('output_format', 'text')
    }

    if enable_debug:
        result['debug'] = {
            'initial_variables': list(variables.keys()),
            'stages': pipeline_debug
        }

    return result
async def execute_prompt_with_data(
    prompt_slug: str,
    profile_id: str,
    modules: Optional[Dict[str, bool]] = None,
    timeframes: Optional[Dict[str, int]] = None,
    openrouter_call_func = None,
    enable_debug: bool = False
) -> Dict[str, Any]:
    """
    Execute prompt with data loaded from database.

    Args:
        prompt_slug: Slug of prompt to execute
        profile_id: User profile ID
        modules: Dict of module -> enabled (e.g., {"körper": True})
        timeframes: Dict of module -> days (e.g., {"körper": 30})
        openrouter_call_func: Async function for AI calls
        enable_debug: If True, include debug information in response

    Returns:
        Execution result dict
    """
    from datetime import datetime, timedelta
    from placeholder_resolver import get_placeholder_example_values, get_placeholder_catalog

    # Guard against modules being set while timeframes is omitted
    timeframes = timeframes or {}

    # Build variables from data modules
    variables = {
        'profile_id': profile_id,
        'today': datetime.now().strftime('%Y-%m-%d')
    }

    # Load placeholder catalog for |d modifier support
    try:
        catalog = get_placeholder_catalog(profile_id)
    except Exception as e:
        catalog = None
        print(f"Warning: Could not load placeholder catalog: {e}")

    variables['_catalog'] = catalog  # Will be popped in execute_prompt (can be None)

    # Add PROCESSED placeholders (name, weight_trend, caliper_summary, etc.)
    # This makes old-style prompts work with the new executor
    try:
        processed_placeholders = get_placeholder_example_values(profile_id)
        # Remove {{ }} from keys (placeholder_resolver returns them with wrappers)
        cleaned_placeholders = {
            key.replace('{{', '').replace('}}', ''): value
            for key, value in processed_placeholders.items()
        }
        variables.update(cleaned_placeholders)
    except Exception as e:
        # Continue even if placeholder resolution fails
        if enable_debug:
            variables['_placeholder_error'] = str(e)

    # Load data for enabled modules
    if modules:
        with get_db() as conn:
            cur = get_cursor(conn)

            # Weight data
            if modules.get('körper'):
                days = timeframes.get('körper', 30)
                since = (datetime.now() - timedelta(days=days)).strftime('%Y-%m-%d')
                cur.execute(
                    """SELECT date, weight FROM weight_log
                       WHERE profile_id = %s AND date >= %s
                       ORDER BY date DESC""",
                    (profile_id, since)
                )
                variables['weight_data'] = [r2d(r) for r in cur.fetchall()]

            # Nutrition data
            if modules.get('ernährung'):
                days = timeframes.get('ernährung', 30)
                since = (datetime.now() - timedelta(days=days)).strftime('%Y-%m-%d')
                cur.execute(
                    """SELECT date, kcal, protein_g, fat_g, carbs_g
                       FROM nutrition_log
                       WHERE profile_id = %s AND date >= %s
                       ORDER BY date DESC""",
                    (profile_id, since)
                )
                variables['nutrition_data'] = [r2d(r) for r in cur.fetchall()]

            # Activity data
            if modules.get('training'):
                days = timeframes.get('training', 14)
                since = (datetime.now() - timedelta(days=days)).strftime('%Y-%m-%d')
                cur.execute(
                    """SELECT date, activity_type, duration_min, kcal_active, hr_avg
                       FROM activity_log
                       WHERE profile_id = %s AND date >= %s
                       ORDER BY date DESC""",
                    (profile_id, since)
                )
                variables['activity_data'] = [r2d(r) for r in cur.fetchall()]

            # Sleep data
            if modules.get('schlaf'):
                days = timeframes.get('schlaf', 14)
                since = (datetime.now() - timedelta(days=days)).strftime('%Y-%m-%d')
                cur.execute(
                    """SELECT date, sleep_segments, source
                       FROM sleep_log
                       WHERE profile_id = %s AND date >= %s
                       ORDER BY date DESC""",
                    (profile_id, since)
                )
                variables['sleep_data'] = [r2d(r) for r in cur.fetchall()]

            # Vitals data
            if modules.get('vitalwerte'):
                days = timeframes.get('vitalwerte', 7)
                since = (datetime.now() - timedelta(days=days)).strftime('%Y-%m-%d')

                # Baseline vitals
                cur.execute(
                    """SELECT date, resting_hr, hrv, vo2_max, spo2, respiratory_rate
                       FROM vitals_baseline
                       WHERE profile_id = %s AND date >= %s
                       ORDER BY date DESC""",
                    (profile_id, since)
                )
                variables['vitals_baseline'] = [r2d(r) for r in cur.fetchall()]

                # Blood pressure
                cur.execute(
                    """SELECT measured_at, systolic, diastolic, pulse
                       FROM blood_pressure_log
                       WHERE profile_id = %s AND measured_at >= %s
                       ORDER BY measured_at DESC""",
                    (profile_id, since + ' 00:00:00')
                )
                variables['blood_pressure'] = [r2d(r) for r in cur.fetchall()]

            # Mental/Goals (no timeframe, just current state)
            if modules.get('mentales') or modules.get('ziele'):
                # TODO: Add mental state / goals data when implemented
                variables['goals_data'] = []

    # Execute prompt
    return await execute_prompt(prompt_slug, variables, openrouter_call_func, enable_debug)
125 backend/quality_filter.py (new file)

@@ -0,0 +1,125 @@
"""
|
||||||
|
Quality Filter Helper - Data Access Layer
|
||||||
|
|
||||||
|
Provides consistent quality filtering across all activity queries.
|
||||||
|
Issue: #31
|
||||||
|
"""
|
||||||
|
from typing import Optional, Dict
|
||||||
|
|
||||||
|
|
||||||
|
def get_quality_filter_sql(profile: Dict, table_alias: str = "") -> str:
|
||||||
|
"""
|
||||||
|
Returns SQL WHERE clause fragment for quality filtering.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
profile: User profile dict with quality_filter_level
|
||||||
|
table_alias: Optional table alias (e.g., "a." for "a.quality_label")
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
SQL fragment (e.g., "AND quality_label IN (...)") or empty string
|
||||||
|
|
||||||
|
Examples:
|
||||||
|
>>> get_quality_filter_sql({'quality_filter_level': 'all'})
|
||||||
|
''
|
||||||
|
>>> get_quality_filter_sql({'quality_filter_level': 'quality'})
|
||||||
|
"AND quality_label IN ('excellent', 'good', 'acceptable')"
|
||||||
|
>>> get_quality_filter_sql({'quality_filter_level': 'excellent'}, 'a.')
|
||||||
|
"AND a.quality_label = 'excellent'"
|
||||||
|
"""
|
||||||
|
level = profile.get('quality_filter_level', 'all')
|
||||||
|
prefix = table_alias if table_alias else ""
|
||||||
|
|
||||||
|
if level == 'all':
|
||||||
|
return '' # No filter
|
||||||
|
elif level == 'quality':
|
||||||
|
return f"AND {prefix}quality_label IN ('excellent', 'good', 'acceptable')"
|
||||||
|
elif level == 'very_good':
|
||||||
|
return f"AND {prefix}quality_label IN ('excellent', 'good')"
|
||||||
|
elif level == 'excellent':
|
||||||
|
return f"AND {prefix}quality_label = 'excellent'"
|
||||||
|
else:
|
||||||
|
# Unknown level → no filter (safe fallback)
|
||||||
|
return ''
|
||||||
|
|
||||||
|
|
||||||
|
def get_quality_filter_tuple(profile: Dict) -> tuple:
|
||||||
|
"""
|
||||||
|
Returns tuple of allowed quality labels for Python filtering.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
profile: User profile dict with quality_filter_level
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Tuple of allowed quality labels or None (no filter)
|
||||||
|
|
||||||
|
Examples:
|
||||||
|
>>> get_quality_filter_tuple({'quality_filter_level': 'all'})
|
||||||
|
None
|
||||||
|
>>> get_quality_filter_tuple({'quality_filter_level': 'quality'})
|
||||||
|
('excellent', 'good', 'acceptable')
|
||||||
|
"""
|
||||||
|
level = profile.get('quality_filter_level', 'all')
|
||||||
|
|
||||||
|
if level == 'all':
|
||||||
|
return None # No filter
|
||||||
|
elif level == 'quality':
|
||||||
|
return ('excellent', 'good', 'acceptable')
|
||||||
|
elif level == 'very_good':
|
||||||
|
return ('excellent', 'good')
|
||||||
|
elif level == 'excellent':
|
||||||
|
return ('excellent',)
|
||||||
|
else:
|
||||||
|
return None # Unknown level → no filter
|
||||||
|
|
||||||
|
|
||||||
|
def filter_activities_by_quality(activities: list, profile: Dict) -> list:
|
||||||
|
"""
|
||||||
|
Filters a list of activity dicts by quality_label.
|
||||||
|
|
||||||
|
Useful for post-query filtering (e.g., when data already loaded).
|
||||||
|
|
||||||
|
Args:
|
||||||
|
activities: List of activity dicts with quality_label field
|
||||||
|
profile: User profile dict with quality_filter_level
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Filtered list of activities
|
||||||
|
"""
|
||||||
|
allowed_labels = get_quality_filter_tuple(profile)
|
||||||
|
|
||||||
|
if allowed_labels is None:
|
||||||
|
return activities # No filter
|
||||||
|
|
||||||
|
return [
|
||||||
|
act for act in activities
|
||||||
|
if act.get('quality_label') in allowed_labels
|
||||||
|
]
|
||||||
|
|
||||||
|
|
||||||
|
# Constants for frontend/documentation
|
||||||
|
QUALITY_LEVELS = {
|
||||||
|
'all': {
|
||||||
|
'label': 'Alle',
|
||||||
|
'icon': '📊',
|
||||||
|
'description': 'Alle Activities (kein Filter)',
|
||||||
|
'includes': None
|
||||||
|
},
|
||||||
|
'quality': {
|
||||||
|
'label': 'Hochwertig',
|
||||||
|
'icon': '✓',
|
||||||
|
'description': 'Hochwertige Activities',
|
||||||
|
'includes': ['excellent', 'good', 'acceptable']
|
||||||
|
},
|
||||||
|
'very_good': {
|
||||||
|
'label': 'Sehr gut',
|
||||||
|
'icon': '✓✓',
|
||||||
|
'description': 'Sehr gute Activities',
|
||||||
|
'includes': ['excellent', 'good']
|
||||||
|
},
|
||||||
|
'excellent': {
|
||||||
|
'label': 'Exzellent',
|
||||||
|
'icon': '⭐',
|
||||||
|
'description': 'Nur exzellente Activities',
|
||||||
|
'includes': ['excellent']
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
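A minimal standalone sketch of how the two helper styles compose (the profile dict and activity rows here are made up for illustration; this mirrors, but does not import, the module above):

```python
# Level → allowed labels; None means "no filter" ('all' or unknown levels).
QUALITY_SETS = {
    'quality': ('excellent', 'good', 'acceptable'),
    'very_good': ('excellent', 'good'),
    'excellent': ('excellent',),
}

def quality_filter_sql(profile: dict, prefix: str = "") -> str:
    """Build the SQL fragment appended to activity queries."""
    labels = QUALITY_SETS.get(profile.get('quality_filter_level', 'all'))
    if labels is None:
        return ''  # no filter
    if len(labels) == 1:
        return f"AND {prefix}quality_label = '{labels[0]}'"
    # tuple repr happens to match SQL IN syntax: ('excellent', 'good', ...)
    return f"AND {prefix}quality_label IN {labels}"

def filter_by_quality(activities: list, profile: dict) -> list:
    """Post-query filtering for rows already loaded into Python."""
    labels = QUALITY_SETS.get(profile.get('quality_filter_level', 'all'))
    if labels is None:
        return activities
    return [a for a in activities if a.get('quality_label') in labels]

acts = [{'id': 1, 'quality_label': 'excellent'},
        {'id': 2, 'quality_label': 'poor'}]
print(quality_filter_sql({'quality_filter_level': 'excellent'}, 'a.'))
print(filter_by_quality(acts, {'quality_filter_level': 'quality'}))  # keeps id 1 only
```

Keeping both the SQL and the Python path driven by the same label sets is what makes the filtering consistent across queries.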
@ -8,3 +8,4 @@ pydantic==2.7.1
bcrypt==4.1.3
slowapi==0.1.9
psycopg2-binary==2.9.9
python-dateutil==2.9.0
0 backend/routers/__init__.py Normal file
192 backend/routers/access_grants.py Normal file

@ -0,0 +1,192 @@
"""
Access Grants Management Endpoints for Mitai Jinkendo

Admin-only access grants history and manual grant creation.
"""
import json
from datetime import datetime, timedelta
from typing import Optional

from fastapi import APIRouter, HTTPException, Depends

from db import get_db, get_cursor, r2d
from auth import require_admin

router = APIRouter(prefix="/api/access-grants", tags=["access-grants"])


@router.get("")
def list_access_grants(
    profile_id: Optional[str] = None,
    active_only: bool = False,
    session: dict = Depends(require_admin)
):
    """
    Admin: List access grants.

    Query params:
    - profile_id: Filter by user
    - active_only: Only show currently active grants
    """
    with get_db() as conn:
        cur = get_cursor(conn)

        query = """
            SELECT
                ag.*,
                t.name as tier_name,
                p.name as profile_name,
                p.email as profile_email
            FROM access_grants ag
            JOIN tiers t ON t.id = ag.tier_id
            JOIN profiles p ON p.id = ag.profile_id
        """

        conditions = []
        params = []

        if profile_id:
            conditions.append("ag.profile_id = %s")
            params.append(profile_id)

        if active_only:
            conditions.append("ag.is_active = true")
            conditions.append("ag.valid_until > CURRENT_TIMESTAMP")

        if conditions:
            query += " WHERE " + " AND ".join(conditions)

        query += " ORDER BY ag.valid_until DESC"

        cur.execute(query, params)
        return [r2d(r) for r in cur.fetchall()]


@router.post("")
def create_access_grant(data: dict, session: dict = Depends(require_admin)):
    """
    Admin: Manually create access grant.

    Body:
    {
        "profile_id": "uuid",
        "tier_id": "premium",
        "duration_days": 30,
        "reason": "Compensation for bug"
    }
    """
    profile_id = data.get('profile_id')
    tier_id = data.get('tier_id')
    duration_days = data.get('duration_days')
    reason = data.get('reason', '')

    if not profile_id or not tier_id or not duration_days:
        raise HTTPException(400, "profile_id, tier_id und duration_days fehlen")

    valid_from = datetime.now()
    valid_until = valid_from + timedelta(days=duration_days)

    with get_db() as conn:
        cur = get_cursor(conn)

        # Create grant
        cur.execute("""
            INSERT INTO access_grants (
                profile_id, tier_id, granted_by, valid_from, valid_until
            )
            VALUES (%s, %s, 'admin', %s, %s)
            RETURNING id
        """, (profile_id, tier_id, valid_from, valid_until))

        grant_id = cur.fetchone()['id']

        # Log activity (json.dumps instead of a hand-built f-string,
        # so quotes in `reason` cannot produce invalid JSON)
        cur.execute("""
            INSERT INTO user_activity_log (profile_id, action, details)
            VALUES (%s, 'access_grant_created', %s)
        """, (
            profile_id,
            json.dumps({"tier": tier_id, "duration_days": duration_days, "reason": reason})
        ))

        conn.commit()

        return {
            "ok": True,
            "id": grant_id,
            "valid_until": valid_until.isoformat()
        }


@router.put("/{grant_id}")
def update_access_grant(grant_id: str, data: dict, session: dict = Depends(require_admin)):
    """
    Admin: Update access grant (e.g., extend duration, pause/resume).

    Body:
    {
        "is_active": false,                     // Pause grant
        "valid_until": "2026-12-31T23:59:59"    // Extend
    }
    """
    with get_db() as conn:
        cur = get_cursor(conn)

        updates = []
        values = []

        if 'is_active' in data:
            updates.append('is_active = %s')
            values.append(data['is_active'])

            if not data['is_active']:
                # Pausing - calculate remaining days
                cur.execute("SELECT valid_until FROM access_grants WHERE id = %s", (grant_id,))
                grant = cur.fetchone()
                if grant:
                    remaining = (grant['valid_until'] - datetime.now()).days
                    updates.append('remaining_days = %s')
                    values.append(remaining)
                    updates.append('paused_at = CURRENT_TIMESTAMP')

        if 'valid_until' in data:
            updates.append('valid_until = %s')
            values.append(data['valid_until'])

        if not updates:
            return {"ok": True}

        updates.append('updated = CURRENT_TIMESTAMP')
        values.append(grant_id)

        cur.execute(
            f"UPDATE access_grants SET {', '.join(updates)} WHERE id = %s",
            values
        )
        conn.commit()

    return {"ok": True}


@router.delete("/{grant_id}")
def revoke_access_grant(grant_id: str, session: dict = Depends(require_admin)):
    """Admin: Revoke access grant (hard delete)."""
    with get_db() as conn:
        cur = get_cursor(conn)

        # Get grant info for logging
        cur.execute("SELECT profile_id, tier_id FROM access_grants WHERE id = %s", (grant_id,))
        grant = cur.fetchone()

        if grant:
            # Log revocation
            cur.execute("""
                INSERT INTO user_activity_log (profile_id, action, details)
                VALUES (%s, 'access_grant_revoked', %s)
            """, (
                grant['profile_id'],
                json.dumps({"grant_id": grant_id, "tier": grant["tier_id"]})
            ))

        # Delete grant
        cur.execute("DELETE FROM access_grants WHERE id = %s", (grant_id,))
        conn.commit()

    return {"ok": True}
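`update_access_grant` snapshots the remaining days into `remaining_days` when a grant is paused, so a later resume can restore the unused time. Only the pause side appears in this diff; the `resume_grant` counterpart below is an assumed sketch of the intended bookkeeping, not code from the PR:

```python
from datetime import datetime, timedelta

def pause_grant(grant: dict, now: datetime) -> dict:
    """Deactivate a grant and snapshot how many days were left."""
    grant['is_active'] = False
    grant['remaining_days'] = (grant['valid_until'] - now).days
    grant['paused_at'] = now
    return grant

def resume_grant(grant: dict, now: datetime) -> dict:
    """Hypothetical resume: re-extend valid_until by the snapshotted days."""
    grant['is_active'] = True
    grant['valid_until'] = now + timedelta(days=grant.pop('remaining_days'))
    return grant

g = {'valid_until': datetime(2026, 1, 31), 'is_active': True}
pause_grant(g, datetime(2026, 1, 1))   # 30 days left at pause time
resume_grant(g, datetime(2026, 3, 1))  # valid until 2026-03-31
```

The point of the snapshot is that a paused month costs the user nothing: the clock only runs while the grant is active.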
460 backend/routers/activity.py Normal file

@ -0,0 +1,460 @@
"""
Activity Tracking Endpoints for Mitai Jinkendo

Handles workout/activity logging, statistics, and Apple Health CSV import.
"""
import csv
import io
import uuid
import logging
from typing import Optional

from fastapi import APIRouter, HTTPException, UploadFile, File, Header, Depends

from db import get_db, get_cursor, r2d
from auth import require_auth, check_feature_access, increment_feature_usage
from models import ActivityEntry
from routers.profiles import get_pid
from feature_logger import log_feature_usage
from quality_filter import get_quality_filter_sql

# Logger must exist before the import guard below can log a failure
logger = logging.getLogger(__name__)

# Evaluation import with error handling (Phase 1.2)
try:
    from evaluation_helper import evaluate_and_save_activity
    EVALUATION_AVAILABLE = True
except Exception as e:
    logger.warning(f"[AUTO-EVAL] Evaluation system not available: {e}")
    EVALUATION_AVAILABLE = False
    evaluate_and_save_activity = None

router = APIRouter(prefix="/api/activity", tags=["activity"])
@router.get("")
def list_activity(limit: int = 200, x_profile_id: Optional[str] = Header(default=None), session: dict = Depends(require_auth)):
    """Get activity entries for current profile."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        cur = get_cursor(conn)

        # Issue #31: Apply global quality filter
        cur.execute("SELECT * FROM profiles WHERE id=%s", (pid,))
        profile = r2d(cur.fetchone())
        quality_filter = get_quality_filter_sql(profile)

        cur.execute(f"""
            SELECT * FROM activity_log
            WHERE profile_id=%s
            {quality_filter}
            ORDER BY date DESC, start_time DESC
            LIMIT %s
        """, (pid, limit))
        return [r2d(r) for r in cur.fetchall()]


@router.post("")
def create_activity(e: ActivityEntry, x_profile_id: Optional[str] = Header(default=None), session: dict = Depends(require_auth)):
    """Create new activity entry."""
    pid = get_pid(x_profile_id)

    # Phase 4: Check feature access and ENFORCE
    access = check_feature_access(pid, 'activity_entries')
    log_feature_usage(pid, 'activity_entries', access, 'create')

    if not access['allowed']:
        logger.warning(
            f"[FEATURE-LIMIT] User {pid} blocked: "
            f"activity_entries {access['reason']} (used: {access['used']}, limit: {access['limit']})"
        )
        raise HTTPException(
            status_code=403,
            detail=f"Limit erreicht: Du hast das Kontingent für Aktivitätseinträge überschritten ({access['used']}/{access['limit']}). "
                   f"Bitte kontaktiere den Admin oder warte bis zum nächsten Reset."
        )

    eid = str(uuid.uuid4())
    d = e.model_dump()
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""INSERT INTO activity_log
            (id,profile_id,date,start_time,end_time,activity_type,duration_min,kcal_active,kcal_resting,
             hr_avg,hr_max,distance_km,rpe,source,notes,created)
            VALUES (%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,CURRENT_TIMESTAMP)""",
            (eid, pid, d['date'], d['start_time'], d['end_time'], d['activity_type'], d['duration_min'],
             d['kcal_active'], d['kcal_resting'], d['hr_avg'], d['hr_max'], d['distance_km'],
             d['rpe'], d['source'], d['notes']))

        # Phase 1.2: Auto-evaluation after INSERT
        if EVALUATION_AVAILABLE:
            # Load the activity data to evaluate
            cur.execute("""
                SELECT id, profile_id, date, training_type_id, duration_min,
                       hr_avg, hr_max, distance_km, kcal_active, kcal_resting,
                       rpe, pace_min_per_km, cadence, elevation_gain
                FROM activity_log
                WHERE id = %s
            """, (eid,))
            activity_row = cur.fetchone()
            if activity_row:
                activity_dict = dict(activity_row)
                training_type_id = activity_dict.get("training_type_id")
                if training_type_id:
                    try:
                        evaluate_and_save_activity(cur, eid, activity_dict, training_type_id, pid)
                        logger.info(f"[AUTO-EVAL] Evaluated activity {eid} on INSERT")
                    except Exception as eval_error:
                        logger.error(f"[AUTO-EVAL] Failed to evaluate activity {eid}: {eval_error}")

    # Phase 2: Increment usage counter (always for new entries)
    increment_feature_usage(pid, 'activity_entries')

    return {"id": eid, "date": e.date}
@router.put("/{eid}")
def update_activity(eid: str, e: ActivityEntry, x_profile_id: Optional[str] = Header(default=None), session: dict = Depends(require_auth)):
    """Update existing activity entry."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        d = e.model_dump()
        cur = get_cursor(conn)
        cur.execute(f"UPDATE activity_log SET {', '.join(f'{k}=%s' for k in d)} WHERE id=%s AND profile_id=%s",
                    list(d.values()) + [eid, pid])

        # Phase 1.2: Auto-evaluation after UPDATE
        if EVALUATION_AVAILABLE:
            # Load the updated activity data to evaluate
            cur.execute("""
                SELECT id, profile_id, date, training_type_id, duration_min,
                       hr_avg, hr_max, distance_km, kcal_active, kcal_resting,
                       rpe, pace_min_per_km, cadence, elevation_gain
                FROM activity_log
                WHERE id = %s
            """, (eid,))
            activity_row = cur.fetchone()
            if activity_row:
                activity_dict = dict(activity_row)
                training_type_id = activity_dict.get("training_type_id")
                if training_type_id:
                    try:
                        evaluate_and_save_activity(cur, eid, activity_dict, training_type_id, pid)
                        logger.info(f"[AUTO-EVAL] Re-evaluated activity {eid} on UPDATE")
                    except Exception as eval_error:
                        logger.error(f"[AUTO-EVAL] Failed to re-evaluate activity {eid}: {eval_error}")

    return {"id": eid}


@router.delete("/{eid}")
def delete_activity(eid: str, x_profile_id: Optional[str] = Header(default=None), session: dict = Depends(require_auth)):
    """Delete activity entry."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("DELETE FROM activity_log WHERE id=%s AND profile_id=%s", (eid, pid))
    return {"ok": True}
@router.get("/stats")
def activity_stats(x_profile_id: Optional[str] = Header(default=None), session: dict = Depends(require_auth)):
    """Get activity statistics (last 30 entries)."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute(
            "SELECT * FROM activity_log WHERE profile_id=%s ORDER BY date DESC LIMIT 30", (pid,))
        rows = [r2d(r) for r in cur.fetchall()]
        if not rows:
            return {"count": 0, "total_kcal": 0, "total_min": 0, "by_type": {}}
        total_kcal = sum(float(r.get('kcal_active') or 0) for r in rows)
        total_min = sum(float(r.get('duration_min') or 0) for r in rows)
        by_type = {}
        for r in rows:
            t = r['activity_type']
            by_type.setdefault(t, {'count': 0, 'kcal': 0, 'min': 0})
            by_type[t]['count'] += 1
            by_type[t]['kcal'] += float(r.get('kcal_active') or 0)
            by_type[t]['min'] += float(r.get('duration_min') or 0)
        return {"count": len(rows), "total_kcal": round(total_kcal), "total_min": round(total_min), "by_type": by_type}
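The `/stats` endpoint boils down to a per-type fold over the fetched rows, with `or 0` guarding against NULL columns. Sketched standalone with made-up rows:

```python
def aggregate(rows):
    """Fold activity rows into totals and a per-type breakdown."""
    by_type = {}
    total_kcal = total_min = 0.0
    for r in rows:
        t = r['activity_type']
        bucket = by_type.setdefault(t, {'count': 0, 'kcal': 0.0, 'min': 0.0})
        kcal = float(r.get('kcal_active') or 0)   # NULL → 0
        mins = float(r.get('duration_min') or 0)
        bucket['count'] += 1
        bucket['kcal'] += kcal
        bucket['min'] += mins
        total_kcal += kcal
        total_min += mins
    return {'count': len(rows), 'total_kcal': round(total_kcal),
            'total_min': round(total_min), 'by_type': by_type}

rows = [
    {'activity_type': 'Running', 'kcal_active': 400, 'duration_min': 45},
    {'activity_type': 'Running', 'kcal_active': None, 'duration_min': 30},
    {'activity_type': 'Yoga', 'kcal_active': 120, 'duration_min': 60},
]
```

Note the NULL row still counts toward `count` and `min`; only its missing kcal is treated as zero.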
def get_training_type_for_activity(activity_type: str, profile_id: Optional[str] = None):
    """
    Map activity_type to training_type_id using database mappings.

    Priority:
    1. User-specific mapping (profile_id)
    2. Global mapping (profile_id = NULL)
    3. No mapping found → returns (None, None, None)

    Returns: (training_type_id, category, subcategory) or (None, None, None)
    """
    with get_db() as conn:
        cur = get_cursor(conn)

        # Try user-specific mapping first
        if profile_id:
            cur.execute("""
                SELECT m.training_type_id, t.category, t.subcategory
                FROM activity_type_mappings m
                JOIN training_types t ON m.training_type_id = t.id
                WHERE m.activity_type = %s AND m.profile_id = %s
                LIMIT 1
            """, (activity_type, profile_id))
            row = cur.fetchone()
            if row:
                return (row['training_type_id'], row['category'], row['subcategory'])

        # Try global mapping
        cur.execute("""
            SELECT m.training_type_id, t.category, t.subcategory
            FROM activity_type_mappings m
            JOIN training_types t ON m.training_type_id = t.id
            WHERE m.activity_type = %s AND m.profile_id IS NULL
            LIMIT 1
        """, (activity_type,))
        row = cur.fetchone()
        if row:
            return (row['training_type_id'], row['category'], row['subcategory'])

        return (None, None, None)
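The two-step lookup in `get_training_type_for_activity` (user-specific mapping first, then a global fallback keyed on a NULL profile) can be sketched with plain dicts; the mapping contents here are illustrative, not real IDs:

```python
def resolve_mapping(activity_type, profile_id, mappings):
    """mappings: {(activity_type, profile_id_or_None): training_type_id}"""
    if profile_id is not None:
        hit = mappings.get((activity_type, profile_id))
        if hit is not None:
            return hit  # user-specific mapping wins
    # global fallback (profile key None), may itself be None
    return mappings.get((activity_type, None))

m = {('Running', 'u1'): 7, ('Running', None): 1}
resolve_mapping('Running', 'u1', m)   # user-specific mapping
resolve_mapping('Running', 'u2', m)   # falls back to global
resolve_mapping('Cycling', 'u1', m)   # no mapping at all
```

The same precedence is what the CSV import relies on: a user can override a global mapping without touching anyone else's data.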
@router.get("/uncategorized")
def list_uncategorized_activities(x_profile_id: Optional[str] = Header(default=None), session: dict = Depends(require_auth)):
    """Get activities without assigned training type, grouped by activity_type."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""
            SELECT activity_type, COUNT(*) as count,
                   MIN(date) as first_date, MAX(date) as last_date
            FROM activity_log
            WHERE profile_id=%s AND training_type_id IS NULL
            GROUP BY activity_type
            ORDER BY count DESC
        """, (pid,))
        return [r2d(r) for r in cur.fetchall()]
@router.post("/bulk-categorize")
def bulk_categorize_activities(
    data: dict,
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """
    Bulk update training type for activities.

    Also saves the mapping to activity_type_mappings for future imports.

    Body: {
        "activity_type": "Running",
        "training_type_id": 1,
        "training_category": "cardio",
        "training_subcategory": "running"
    }
    """
    pid = get_pid(x_profile_id)
    activity_type = data.get('activity_type')
    training_type_id = data.get('training_type_id')
    training_category = data.get('training_category')
    training_subcategory = data.get('training_subcategory')

    if not activity_type or not training_type_id:
        raise HTTPException(400, "activity_type and training_type_id required")

    with get_db() as conn:
        cur = get_cursor(conn)

        # Update existing activities
        cur.execute("""
            UPDATE activity_log
            SET training_type_id = %s,
                training_category = %s,
                training_subcategory = %s
            WHERE profile_id = %s
              AND activity_type = %s
              AND training_type_id IS NULL
        """, (training_type_id, training_category, training_subcategory, pid, activity_type))
        updated_count = cur.rowcount

        # Phase 1.2: Auto-evaluation after bulk categorization
        if EVALUATION_AVAILABLE:
            # Load all activities that were just updated and evaluate them
            cur.execute("""
                SELECT id, profile_id, date, training_type_id, duration_min,
                       hr_avg, hr_max, distance_km, kcal_active, kcal_resting,
                       rpe, pace_min_per_km, cadence, elevation_gain
                FROM activity_log
                WHERE profile_id = %s
                  AND activity_type = %s
                  AND training_type_id = %s
            """, (pid, activity_type, training_type_id))

            activities_to_evaluate = cur.fetchall()
            evaluated_count = 0
            for activity_row in activities_to_evaluate:
                activity_dict = dict(activity_row)
                try:
                    evaluate_and_save_activity(cur, activity_dict["id"], activity_dict, training_type_id, pid)
                    evaluated_count += 1
                except Exception as eval_error:
                    logger.warning(f"[AUTO-EVAL] Failed to evaluate bulk-categorized activity {activity_dict['id']}: {eval_error}")

            logger.info(f"[AUTO-EVAL] Evaluated {evaluated_count}/{updated_count} bulk-categorized activities")

        # Save mapping for future imports (upsert)
        cur.execute("""
            INSERT INTO activity_type_mappings (activity_type, training_type_id, profile_id, source, updated_at)
            VALUES (%s, %s, %s, 'bulk', CURRENT_TIMESTAMP)
            ON CONFLICT (activity_type, profile_id)
            DO UPDATE SET
                training_type_id = EXCLUDED.training_type_id,
                source = 'bulk',
                updated_at = CURRENT_TIMESTAMP
        """, (activity_type, training_type_id, pid))

        logger.info(f"[MAPPING] Saved bulk mapping: {activity_type} → training_type_id {training_type_id} (profile {pid})")

    return {"updated": updated_count, "activity_type": activity_type, "mapping_saved": True}
@router.post("/import-csv")
async def import_activity_csv(file: UploadFile = File(...), x_profile_id: Optional[str] = Header(default=None), session: dict = Depends(require_auth)):
    """Import Apple Health workout CSV with automatic training type mapping."""
    pid = get_pid(x_profile_id)
    raw = await file.read()
    try:
        text = raw.decode('utf-8')
    except UnicodeDecodeError:
        text = raw.decode('latin-1')
    if text.startswith('\ufeff'):
        text = text[1:]
    if not text.strip():
        raise HTTPException(400, "Leere Datei")
    reader = csv.DictReader(io.StringIO(text))
    inserted = skipped = 0

    def kj(v):
        """kJ → whole kcal (1 kcal = 4.184 kJ); None on blank/unparseable."""
        try:
            return round(float(v) / 4.184) if v else None
        except (TypeError, ValueError):
            return None

    def tf(v):
        """Float rounded to one decimal; None on blank/unparseable."""
        try:
            return round(float(v), 1) if v else None
        except (TypeError, ValueError):
            return None

    with get_db() as conn:
        cur = get_cursor(conn)
        for row in reader:
            wtype = row.get('Workout Type', '').strip()
            start = row.get('Start', '').strip()
            if not wtype or not start:
                continue
            date = start[:10]
            dur = row.get('Duration', '').strip()
            duration_min = None
            if dur:
                try:
                    p = dur.split(':')
                    duration_min = round(int(p[0]) * 60 + int(p[1]) + int(p[2]) / 60, 1)
                except (IndexError, ValueError):
                    pass

            # Map activity_type to training_type_id using database mappings
            training_type_id, training_category, training_subcategory = get_training_type_for_activity(wtype, pid)

            try:
                # Check if entry already exists (duplicate detection by date + start_time)
                cur.execute("""
                    SELECT id FROM activity_log
                    WHERE profile_id = %s AND date = %s AND start_time = %s
                """, (pid, date, start))
                existing = cur.fetchone()

                if existing:
                    # Update existing entry (e.g., to add training type mapping)
                    existing_id = existing['id']
                    cur.execute("""
                        UPDATE activity_log
                        SET end_time = %s,
                            activity_type = %s,
                            duration_min = %s,
                            kcal_active = %s,
                            kcal_resting = %s,
                            hr_avg = %s,
                            hr_max = %s,
                            distance_km = %s,
                            training_type_id = %s,
                            training_category = %s,
                            training_subcategory = %s
                        WHERE id = %s
                    """, (
                        row.get('End', ''), wtype, duration_min,
                        kj(row.get('Aktive Energie (kJ)', '')),
                        kj(row.get('Ruheeinträge (kJ)', '')),
                        tf(row.get('Durchschn. Herzfrequenz (count/min)', '')),
                        tf(row.get('Max. Herzfrequenz (count/min)', '')),
                        tf(row.get('Distanz (km)', '')),
                        training_type_id, training_category, training_subcategory,
                        existing_id
                    ))
                    skipped += 1  # Count as skipped (not newly inserted)

                    # Phase 1.2: Auto-evaluation after CSV import UPDATE
                    if EVALUATION_AVAILABLE and training_type_id:
                        try:
                            # Build activity dict for evaluation
                            activity_dict = {
                                "id": existing_id,
                                "profile_id": pid,
                                "date": date,
                                "training_type_id": training_type_id,
                                "duration_min": duration_min,
                                "hr_avg": tf(row.get('Durchschn. Herzfrequenz (count/min)', '')),
                                "hr_max": tf(row.get('Max. Herzfrequenz (count/min)', '')),
                                "distance_km": tf(row.get('Distanz (km)', '')),
                                "kcal_active": kj(row.get('Aktive Energie (kJ)', '')),
                                "kcal_resting": kj(row.get('Ruheeinträge (kJ)', '')),
                                "rpe": None,
                                "pace_min_per_km": None,
                                "cadence": None,
                                "elevation_gain": None
                            }
                            evaluate_and_save_activity(cur, existing_id, activity_dict, training_type_id, pid)
                            logger.debug(f"[AUTO-EVAL] Re-evaluated updated activity {existing_id}")
                        except Exception as eval_error:
                            logger.warning(f"[AUTO-EVAL] Failed to re-evaluate updated activity {existing_id}: {eval_error}")
                else:
                    # Insert new entry
                    new_id = str(uuid.uuid4())
                    cur.execute("""INSERT INTO activity_log
                        (id,profile_id,date,start_time,end_time,activity_type,duration_min,kcal_active,kcal_resting,
                         hr_avg,hr_max,distance_km,source,training_type_id,training_category,training_subcategory,created)
                        VALUES (%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,'apple_health',%s,%s,%s,CURRENT_TIMESTAMP)""",
                        (new_id, pid, date, start, row.get('End', ''), wtype, duration_min,
                         kj(row.get('Aktive Energie (kJ)', '')), kj(row.get('Ruheeinträge (kJ)', '')),
                         tf(row.get('Durchschn. Herzfrequenz (count/min)', '')),
                         tf(row.get('Max. Herzfrequenz (count/min)', '')),
                         tf(row.get('Distanz (km)', '')),
                         training_type_id, training_category, training_subcategory))
                    inserted += 1

                    # Phase 1.2: Auto-evaluation after CSV import INSERT
                    if EVALUATION_AVAILABLE and training_type_id:
                        try:
                            # Build activity dict for evaluation
                            activity_dict = {
                                "id": new_id,
                                "profile_id": pid,
                                "date": date,
                                "training_type_id": training_type_id,
                                "duration_min": duration_min,
                                "hr_avg": tf(row.get('Durchschn. Herzfrequenz (count/min)', '')),
                                "hr_max": tf(row.get('Max. Herzfrequenz (count/min)', '')),
                                "distance_km": tf(row.get('Distanz (km)', '')),
                                "kcal_active": kj(row.get('Aktive Energie (kJ)', '')),
                                "kcal_resting": kj(row.get('Ruheeinträge (kJ)', '')),
                                "rpe": None,
                                "pace_min_per_km": None,
                                "cadence": None,
                                "elevation_gain": None
                            }
                            evaluate_and_save_activity(cur, new_id, activity_dict, training_type_id, pid)
                            logger.debug(f"[AUTO-EVAL] Evaluated imported activity {new_id}")
                        except Exception as eval_error:
                            logger.warning(f"[AUTO-EVAL] Failed to evaluate imported activity {new_id}: {eval_error}")
            except Exception as e:
                logger.warning(f"Import row failed: {e}")
                skipped += 1
    return {"inserted": inserted, "skipped": skipped, "message": f"{inserted} Trainings importiert"}
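The import's small converters are worth seeing in isolation: Apple Health exports energy in kilojoules and workout durations as `H:MM:SS`. A standalone sketch of the two conversions (names differ from the router's `kj`/`tf` helpers for clarity):

```python
def kj_to_kcal(v):
    """kJ string → whole kcal (1 kcal = 4.184 kJ); None on blank/garbage."""
    try:
        return round(float(v) / 4.184) if v else None
    except (TypeError, ValueError):
        return None

def duration_to_min(dur):
    """'H:MM:SS' → minutes rounded to one decimal; None if unparseable."""
    try:
        h, m, s = dur.split(':')
        return round(int(h) * 60 + int(m) + int(s) / 60, 1)
    except (AttributeError, ValueError):
        return None

kj_to_kcal('837')           # → 200
duration_to_min('1:23:45')  # → 83.8
```

Returning None instead of raising keeps a single malformed cell from aborting the whole CSV row.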
157 backend/routers/admin.py Normal file

@ -0,0 +1,157 @@
|
||||||
|
"""
Admin Management Endpoints for Mitai Jinkendo

Handles user management, permissions, and email testing (admin-only).
"""
import os
import smtplib
from email.mime.text import MIMEText
from datetime import datetime

from fastapi import APIRouter, HTTPException, Depends

from db import get_db, get_cursor, r2d
from auth import require_admin, hash_pin
from models import AdminProfileUpdate

router = APIRouter(prefix="/api/admin", tags=["admin"])


@router.get("/profiles")
def admin_list_profiles(session: dict=Depends(require_admin)):
    """Admin: List all profiles with stats."""
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("SELECT * FROM profiles ORDER BY created")
        profs = [r2d(r) for r in cur.fetchall()]

        for p in profs:
            pid = p['id']
            cur.execute("SELECT COUNT(*) as count FROM weight_log WHERE profile_id=%s", (pid,))
            p['weight_count'] = cur.fetchone()['count']
            cur.execute("SELECT COUNT(*) as count FROM ai_insights WHERE profile_id=%s", (pid,))
            p['ai_insights_count'] = cur.fetchone()['count']

            today = datetime.now().date().isoformat()
            cur.execute("SELECT call_count FROM ai_usage WHERE profile_id=%s AND date=%s", (pid, today))
            usage = cur.fetchone()
            p['ai_usage_today'] = usage['call_count'] if usage else 0

    return profs


@router.put("/profiles/{pid}")
def admin_update_profile(pid: str, data: AdminProfileUpdate, session: dict=Depends(require_admin)):
    """Admin: Update profile settings."""
    with get_db() as conn:
        updates = {k:v for k,v in data.model_dump().items() if v is not None}
        if not updates:
            return {"ok": True}

        cur = get_cursor(conn)
        cur.execute(f"UPDATE profiles SET {', '.join(f'{k}=%s' for k in updates)} WHERE id=%s",
                    list(updates.values()) + [pid])

    return {"ok": True}
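The dynamic `UPDATE` in `admin_update_profile` interpolates only column names (taken from the `AdminProfileUpdate` model fields) into the SQL string, while all values stay in the parameter list. A small illustration of how the statement and parameters are assembled (the sample field values and profile id are hypothetical):

```python
updates = {"name": "Alice", "role": "member"}  # hypothetical non-None model fields
set_clause = ', '.join(f'{k}=%s' for k in updates)
sql = f"UPDATE profiles SET {set_clause} WHERE id=%s"
params = list(updates.values()) + ["profile-123"]  # hypothetical profile id
# sql is now "UPDATE profiles SET name=%s, role=%s WHERE id=%s"
```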
@router.put("/profiles/{pid}/permissions")
def admin_set_permissions(pid: str, data: dict, session: dict=Depends(require_admin)):
    """Admin: Set profile permissions."""
    with get_db() as conn:
        cur = get_cursor(conn)
        updates = []
        values = []
        if 'ai_enabled' in data:
            updates.append('ai_enabled=%s')
            values.append(data['ai_enabled'])
        if 'ai_limit_day' in data:
            updates.append('ai_limit_day=%s')
            values.append(data['ai_limit_day'])
        if 'export_enabled' in data:
            updates.append('export_enabled=%s')
            values.append(data['export_enabled'])
        if 'role' in data:
            updates.append('role=%s')
            values.append(data['role'])

        if updates:
            cur.execute(f"UPDATE profiles SET {', '.join(updates)} WHERE id=%s", values + [pid])

    return {"ok": True}


@router.put("/profiles/{pid}/email")
def admin_set_email(pid: str, data: dict, session: dict=Depends(require_admin)):
    """Admin: Set profile email."""
    email = data.get('email', '').strip().lower()
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("UPDATE profiles SET email=%s WHERE id=%s", (email if email else None, pid))

    return {"ok": True}


@router.put("/profiles/{pid}/pin")
def admin_set_pin(pid: str, data: dict, session: dict=Depends(require_admin)):
    """Admin: Set profile PIN/password."""
    new_pin = data.get('pin', '')
    if len(new_pin) < 4:
        raise HTTPException(400, "PIN/Passwort muss mind. 4 Zeichen haben")

    new_hash = hash_pin(new_pin)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("UPDATE profiles SET pin_hash=%s WHERE id=%s", (new_hash, pid))

    return {"ok": True}


@router.get("/email/status")
def admin_email_status(session: dict=Depends(require_admin)):
    """Admin: Check email configuration status."""
    smtp_host = os.getenv("SMTP_HOST")
    smtp_user = os.getenv("SMTP_USER")
    smtp_pass = os.getenv("SMTP_PASS")
    app_url = os.getenv("APP_URL", "http://localhost:3002")

    configured = bool(smtp_host and smtp_user and smtp_pass)

    return {
        "configured": configured,
        "smtp_host": smtp_host or "",
        "smtp_user": smtp_user or "",
        "app_url": app_url
    }


@router.post("/email/test")
def admin_test_email(data: dict, session: dict=Depends(require_admin)):
    """Admin: Send test email."""
    email = data.get('to', '')
    if not email:
        raise HTTPException(400, "E-Mail-Adresse fehlt")

    try:
        smtp_host = os.getenv("SMTP_HOST")
        smtp_port = int(os.getenv("SMTP_PORT", 587))
        smtp_user = os.getenv("SMTP_USER")
        smtp_pass = os.getenv("SMTP_PASS")
        smtp_from = os.getenv("SMTP_FROM")

        if not smtp_host or not smtp_user or not smtp_pass:
            raise HTTPException(500, "SMTP nicht konfiguriert")

        msg = MIMEText("Dies ist eine Test-E-Mail von Mitai Jinkendo.")
        msg['Subject'] = "Test-E-Mail"
        msg['From'] = smtp_from
        msg['To'] = email

        with smtplib.SMTP(smtp_host, smtp_port) as server:
            server.starttls()
            server.login(smtp_user, smtp_pass)
            server.send_message(msg)

        return {"ok": True, "message": f"Test-E-Mail an {email} gesendet"}
    except Exception as e:
        raise HTTPException(500, f"Fehler beim Senden: {str(e)}")
219  backend/routers/admin_activity_mappings.py  (new file)
@@ -0,0 +1,219 @@
"""
Admin Activity Type Mappings Management - v9d Phase 1b

CRUD operations for activity_type_mappings (learnable system).
"""
import logging
from typing import Optional
from fastapi import APIRouter, HTTPException, Depends
from pydantic import BaseModel

from db import get_db, get_cursor, r2d
from auth import require_admin

router = APIRouter(prefix="/api/admin/activity-mappings", tags=["admin", "activity-mappings"])
logger = logging.getLogger(__name__)


class ActivityMappingCreate(BaseModel):
    activity_type: str
    training_type_id: int
    profile_id: Optional[str] = None
    source: str = 'admin'


class ActivityMappingUpdate(BaseModel):
    training_type_id: Optional[int] = None
    profile_id: Optional[str] = None
    source: Optional[str] = None


@router.get("")
def list_activity_mappings(
    profile_id: Optional[str] = None,
    global_only: bool = False,
    session: dict = Depends(require_admin)
):
    """
    Get all activity type mappings.

    Filters:
    - profile_id: Show only mappings for specific profile
    - global_only: Show only global mappings (profile_id IS NULL)
    """
    with get_db() as conn:
        cur = get_cursor(conn)

        query = """
            SELECT m.id, m.activity_type, m.training_type_id, m.profile_id, m.source,
                   m.created_at, m.updated_at,
                   t.name_de as training_type_name_de,
                   t.category, t.subcategory, t.icon
            FROM activity_type_mappings m
            JOIN training_types t ON m.training_type_id = t.id
        """

        conditions = []
        params = []

        if global_only:
            conditions.append("m.profile_id IS NULL")
        elif profile_id:
            conditions.append("m.profile_id = %s")
            params.append(profile_id)

        if conditions:
            query += " WHERE " + " AND ".join(conditions)

        query += " ORDER BY m.activity_type"

        cur.execute(query, params)
        rows = cur.fetchall()

    return [r2d(r) for r in rows]
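The filter construction in `list_activity_mappings` appends `WHERE` conditions only when a filter is requested, keeping values parameterized. The same assembly, sketched in isolation (sample inputs are hypothetical):

```python
def build_filter(profile_id=None, global_only=False):
    # Mirrors the condition/params assembly in list_activity_mappings.
    conditions, params = [], []
    if global_only:
        conditions.append("m.profile_id IS NULL")
    elif profile_id:
        conditions.append("m.profile_id = %s")
        params.append(profile_id)
    clause = " WHERE " + " AND ".join(conditions) if conditions else ""
    return clause, params
```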
@router.get("/{mapping_id}")
def get_activity_mapping(mapping_id: int, session: dict = Depends(require_admin)):
    """Get single activity mapping by ID."""
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""
            SELECT m.id, m.activity_type, m.training_type_id, m.profile_id, m.source,
                   m.created_at, m.updated_at,
                   t.name_de as training_type_name_de,
                   t.category, t.subcategory
            FROM activity_type_mappings m
            JOIN training_types t ON m.training_type_id = t.id
            WHERE m.id = %s
        """, (mapping_id,))
        row = cur.fetchone()

    if not row:
        raise HTTPException(404, "Mapping not found")

    return r2d(row)


@router.post("")
def create_activity_mapping(data: ActivityMappingCreate, session: dict = Depends(require_admin)):
    """
    Create new activity type mapping.

    Note: Duplicate (activity_type, profile_id) will fail with 409 Conflict.
    """
    with get_db() as conn:
        cur = get_cursor(conn)

        try:
            cur.execute("""
                INSERT INTO activity_type_mappings
                    (activity_type, training_type_id, profile_id, source)
                VALUES (%s, %s, %s, %s)
                RETURNING id
            """, (
                data.activity_type,
                data.training_type_id,
                data.profile_id,
                data.source
            ))

            new_id = cur.fetchone()['id']

            logger.info(f"[ADMIN] Mapping created: {data.activity_type} → training_type_id {data.training_type_id} (profile: {data.profile_id})")

        except Exception as e:
            if 'unique_activity_type_per_profile' in str(e):
                raise HTTPException(409, f"Mapping for '{data.activity_type}' already exists (profile: {data.profile_id})")
            raise HTTPException(400, f"Failed to create mapping: {str(e)}")

    return {"id": new_id, "message": "Mapping created"}
@router.put("/{mapping_id}")
def update_activity_mapping(
    mapping_id: int,
    data: ActivityMappingUpdate,
    session: dict = Depends(require_admin)
):
    """Update existing activity type mapping."""
    with get_db() as conn:
        cur = get_cursor(conn)

        # Build update query dynamically
        updates = []
        values = []

        if data.training_type_id is not None:
            updates.append("training_type_id = %s")
            values.append(data.training_type_id)
        if data.profile_id is not None:
            updates.append("profile_id = %s")
            values.append(data.profile_id)
        if data.source is not None:
            updates.append("source = %s")
            values.append(data.source)

        if not updates:
            raise HTTPException(400, "No fields to update")

        updates.append("updated_at = CURRENT_TIMESTAMP")
        values.append(mapping_id)

        cur.execute(f"""
            UPDATE activity_type_mappings
            SET {', '.join(updates)}
            WHERE id = %s
        """, values)

        if cur.rowcount == 0:
            raise HTTPException(404, "Mapping not found")

        logger.info(f"[ADMIN] Mapping updated: {mapping_id}")

    return {"id": mapping_id, "message": "Mapping updated"}
@router.delete("/{mapping_id}")
def delete_activity_mapping(mapping_id: int, session: dict = Depends(require_admin)):
    """
    Delete activity type mapping.

    This will cause future imports to NOT auto-assign training type for this activity_type.
    Existing activities with this mapping remain unchanged.
    """
    with get_db() as conn:
        cur = get_cursor(conn)

        cur.execute("DELETE FROM activity_type_mappings WHERE id = %s", (mapping_id,))

        if cur.rowcount == 0:
            raise HTTPException(404, "Mapping not found")

        logger.info(f"[ADMIN] Mapping deleted: {mapping_id}")

    return {"message": "Mapping deleted"}


@router.get("/stats/coverage")
def get_mapping_coverage(session: dict = Depends(require_admin)):
    """
    Get statistics about mapping coverage.

    Returns how many activities are mapped vs unmapped across all profiles.
    """
    with get_db() as conn:
        cur = get_cursor(conn)

        cur.execute("""
            SELECT
                COUNT(*) as total_activities,
                COUNT(training_type_id) as mapped_activities,
                COUNT(*) - COUNT(training_type_id) as unmapped_activities,
                COUNT(DISTINCT activity_type) as unique_activity_types,
                COUNT(DISTINCT CASE WHEN training_type_id IS NULL THEN activity_type END) as unmapped_types
            FROM activity_log
        """)
        stats = r2d(cur.fetchone())

    return stats
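The coverage query relies on SQL's rule that `COUNT(col)` counts only non-NULL values while `COUNT(*)` counts all rows. The same computation over in-memory rows, for illustration (the sample data is hypothetical):

```python
rows = [  # hypothetical activity_log rows
    {"activity_type": "Running", "training_type_id": 3},
    {"activity_type": "Running", "training_type_id": 3},
    {"activity_type": "Yoga", "training_type_id": None},
]
total = len(rows)                                                   # COUNT(*)
mapped = sum(1 for r in rows if r["training_type_id"] is not None)  # COUNT(training_type_id)
unmapped = total - mapped
unmapped_types = {r["activity_type"] for r in rows if r["training_type_id"] is None}
```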
409  backend/routers/admin_training_types.py  (new file)
@@ -0,0 +1,409 @@
"""
Admin Training Types Management - v9d Phase 1b

CRUD operations for training types with abilities mapping preparation.
"""
import logging
from typing import Optional
from fastapi import APIRouter, HTTPException, Depends
from pydantic import BaseModel
from psycopg2.extras import Json

from db import get_db, get_cursor, r2d
from auth import require_auth, require_admin
from profile_templates import list_templates, get_template

router = APIRouter(prefix="/api/admin/training-types", tags=["admin", "training-types"])
logger = logging.getLogger(__name__)


class TrainingTypeCreate(BaseModel):
    category: str
    subcategory: Optional[str] = None
    name_de: str
    name_en: str
    icon: Optional[str] = None
    description_de: Optional[str] = None
    description_en: Optional[str] = None
    sort_order: int = 0
    abilities: Optional[dict] = None
    profile: Optional[dict] = None  # Training Type Profile (Phase 2 #15)


class TrainingTypeUpdate(BaseModel):
    category: Optional[str] = None
    subcategory: Optional[str] = None
    name_de: Optional[str] = None
    name_en: Optional[str] = None
    icon: Optional[str] = None
    description_de: Optional[str] = None
    description_en: Optional[str] = None
    sort_order: Optional[int] = None
    abilities: Optional[dict] = None
    profile: Optional[dict] = None  # Training Type Profile (Phase 2 #15)


@router.get("")
def list_training_types_admin(session: dict = Depends(require_admin)):
    """
    Get all training types for admin management.
    Returns full details including abilities.
    """
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""
            SELECT id, category, subcategory, name_de, name_en, icon,
                   description_de, description_en, sort_order, abilities,
                   profile, created_at
            FROM training_types
            ORDER BY sort_order, category, subcategory
        """)
        rows = cur.fetchall()

    return [r2d(r) for r in rows]


@router.get("/{type_id}")
def get_training_type(type_id: int, session: dict = Depends(require_admin)):
    """Get single training type by ID."""
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""
            SELECT id, category, subcategory, name_de, name_en, icon,
                   description_de, description_en, sort_order, abilities,
                   profile, created_at
            FROM training_types
            WHERE id = %s
        """, (type_id,))
        row = cur.fetchone()

    if not row:
        raise HTTPException(404, "Training type not found")

    return r2d(row)


@router.post("")
def create_training_type(data: TrainingTypeCreate, session: dict = Depends(require_admin)):
    """Create new training type."""
    with get_db() as conn:
        cur = get_cursor(conn)

        # Convert abilities and profile dict to JSONB
        abilities_json = data.abilities if data.abilities else {}
        profile_json = data.profile if data.profile else None

        cur.execute("""
            INSERT INTO training_types
                (category, subcategory, name_de, name_en, icon,
                 description_de, description_en, sort_order, abilities, profile)
            VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
            RETURNING id
        """, (
            data.category,
            data.subcategory,
            data.name_de,
            data.name_en,
            data.icon,
            data.description_de,
            data.description_en,
            data.sort_order,
            Json(abilities_json),
            Json(profile_json) if profile_json else None
        ))

        new_id = cur.fetchone()['id']

        logger.info(f"[ADMIN] Training type created: {new_id} - {data.name_de} ({data.category}/{data.subcategory})")

    return {"id": new_id, "message": "Training type created"}
@router.put("/{type_id}")
def update_training_type(
    type_id: int,
    data: TrainingTypeUpdate,
    session: dict = Depends(require_admin)
):
    """Update existing training type."""
    with get_db() as conn:
        cur = get_cursor(conn)

        # Build update query dynamically
        updates = []
        values = []

        if data.category is not None:
            updates.append("category = %s")
            values.append(data.category)
        if data.subcategory is not None:
            updates.append("subcategory = %s")
            values.append(data.subcategory)
        if data.name_de is not None:
            updates.append("name_de = %s")
            values.append(data.name_de)
        if data.name_en is not None:
            updates.append("name_en = %s")
            values.append(data.name_en)
        if data.icon is not None:
            updates.append("icon = %s")
            values.append(data.icon)
        if data.description_de is not None:
            updates.append("description_de = %s")
            values.append(data.description_de)
        if data.description_en is not None:
            updates.append("description_en = %s")
            values.append(data.description_en)
        if data.sort_order is not None:
            updates.append("sort_order = %s")
            values.append(data.sort_order)
        if data.abilities is not None:
            updates.append("abilities = %s")
            values.append(Json(data.abilities))
        if data.profile is not None:
            updates.append("profile = %s")
            values.append(Json(data.profile))

        if not updates:
            raise HTTPException(400, "No fields to update")

        values.append(type_id)

        cur.execute(f"""
            UPDATE training_types
            SET {', '.join(updates)}
            WHERE id = %s
        """, values)

        if cur.rowcount == 0:
            raise HTTPException(404, "Training type not found")

        logger.info(f"[ADMIN] Training type updated: {type_id}")

    return {"id": type_id, "message": "Training type updated"}


@router.delete("/{type_id}")
def delete_training_type(type_id: int, session: dict = Depends(require_admin)):
    """
    Delete training type.

    WARNING: This will fail if any activities reference this type.
    Consider adding a soft-delete or archive mechanism if needed.
    """
    with get_db() as conn:
        cur = get_cursor(conn)

        # Check if any activities use this type
        cur.execute("""
            SELECT COUNT(*) as count
            FROM activity_log
            WHERE training_type_id = %s
        """, (type_id,))

        count = cur.fetchone()['count']
        if count > 0:
            raise HTTPException(
                400,
                f"Cannot delete: {count} activities are using this training type. "
                "Please reassign or delete those activities first."
            )

        cur.execute("DELETE FROM training_types WHERE id = %s", (type_id,))

        if cur.rowcount == 0:
            raise HTTPException(404, "Training type not found")

        logger.info(f"[ADMIN] Training type deleted: {type_id}")

    return {"message": "Training type deleted"}
@router.get("/taxonomy/abilities")
def get_abilities_taxonomy(session: dict = Depends(require_auth)):
    """
    Get abilities taxonomy for UI and AI analysis.

    This defines the 5 dimensions of athletic development.
    """
    taxonomy = {
        "koordinativ": {
            "name_de": "Koordinative Fähigkeiten",
            "name_en": "Coordination Abilities",
            "icon": "🎯",
            "abilities": [
                {"key": "orientierung", "name_de": "Orientierung", "name_en": "Orientation"},
                {"key": "differenzierung", "name_de": "Differenzierung", "name_en": "Differentiation"},
                {"key": "kopplung", "name_de": "Kopplung", "name_en": "Coupling"},
                {"key": "gleichgewicht", "name_de": "Gleichgewicht", "name_en": "Balance"},
                {"key": "rhythmus", "name_de": "Rhythmisierung", "name_en": "Rhythm"},
                {"key": "reaktion", "name_de": "Reaktion", "name_en": "Reaction"},
                {"key": "umstellung", "name_de": "Umstellung", "name_en": "Adaptation"}
            ]
        },
        "konditionell": {
            "name_de": "Konditionelle Fähigkeiten",
            "name_en": "Conditional Abilities",
            "icon": "💪",
            "abilities": [
                {"key": "kraft", "name_de": "Kraft", "name_en": "Strength"},
                {"key": "ausdauer", "name_de": "Ausdauer", "name_en": "Endurance"},
                {"key": "schnelligkeit", "name_de": "Schnelligkeit", "name_en": "Speed"},
                {"key": "flexibilitaet", "name_de": "Flexibilität", "name_en": "Flexibility"}
            ]
        },
        "kognitiv": {
            "name_de": "Kognitive Fähigkeiten",
            "name_en": "Cognitive Abilities",
            "icon": "🧠",
            "abilities": [
                {"key": "konzentration", "name_de": "Konzentration", "name_en": "Concentration"},
                {"key": "aufmerksamkeit", "name_de": "Aufmerksamkeit", "name_en": "Attention"},
                {"key": "wahrnehmung", "name_de": "Wahrnehmung", "name_en": "Perception"},
                {"key": "entscheidung", "name_de": "Entscheidungsfindung", "name_en": "Decision Making"}
            ]
        },
        "psychisch": {
            "name_de": "Psychische Fähigkeiten",
            "name_en": "Psychological Abilities",
            "icon": "🎭",
            "abilities": [
                {"key": "motivation", "name_de": "Motivation", "name_en": "Motivation"},
                {"key": "willenskraft", "name_de": "Willenskraft", "name_en": "Willpower"},
                {"key": "stressresistenz", "name_de": "Stressresistenz", "name_en": "Stress Resistance"},
                {"key": "selbstvertrauen", "name_de": "Selbstvertrauen", "name_en": "Self-Confidence"}
            ]
        },
        "taktisch": {
            "name_de": "Taktische Fähigkeiten",
            "name_en": "Tactical Abilities",
            "icon": "♟️",
            "abilities": [
                {"key": "timing", "name_de": "Timing", "name_en": "Timing"},
                {"key": "strategie", "name_de": "Strategie", "name_en": "Strategy"},
                {"key": "antizipation", "name_de": "Antizipation", "name_en": "Anticipation"},
                {"key": "situationsanalyse", "name_de": "Situationsanalyse", "name_en": "Situation Analysis"}
            ]
        }
    }

    return taxonomy
# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
# TRAINING TYPE PROFILES - Phase 2 (#15)
# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

@router.get("/profiles/templates")
def list_profile_templates(session: dict = Depends(require_admin)):
    """
    List all available profile templates.

    Returns templates for common training types (Running, Meditation, Strength, etc.)
    """
    return list_templates()


@router.get("/profiles/templates/{template_key}")
def get_profile_template(template_key: str, session: dict = Depends(require_admin)):
    """
    Get a specific profile template by key.

    Keys: running, meditation, strength
    """
    template = get_template(template_key)
    if not template:
        raise HTTPException(404, f"Template '{template_key}' not found")
    return template
@router.post("/{type_id}/profile/apply-template")
def apply_profile_template(
    type_id: int,
    data: dict,
    session: dict = Depends(require_admin)
):
    """
    Apply a profile template to a training type.

    Body: { "template_key": "running" }
    """
    template_key = data.get("template_key")
    if not template_key:
        raise HTTPException(400, "template_key required")

    template = get_template(template_key)
    if not template:
        raise HTTPException(404, f"Template '{template_key}' not found")

    # Apply template to training type
    with get_db() as conn:
        cur = get_cursor(conn)

        # Check if training type exists
        cur.execute("SELECT id, name_de FROM training_types WHERE id = %s", (type_id,))
        training_type = cur.fetchone()
        if not training_type:
            raise HTTPException(404, "Training type not found")

        # Update profile
        cur.execute("""
            UPDATE training_types
            SET profile = %s
            WHERE id = %s
        """, (Json(template), type_id))

        logger.info(f"[ADMIN] Applied template '{template_key}' to training type {type_id} ({training_type['name_de']})")

    return {
        "message": f"Template '{template_key}' applied successfully",
        "training_type_id": type_id,
        "training_type_name": training_type['name_de'],
        "template_key": template_key
    }


@router.get("/profiles/stats")
def get_profile_stats(session: dict = Depends(require_admin)):
    """
    Get statistics about configured profiles.

    Returns count of training types with/without profiles.
    """
    with get_db() as conn:
        cur = get_cursor(conn)

        cur.execute("""
            SELECT
                COUNT(*) as total,
                COUNT(profile) as configured,
                COUNT(*) - COUNT(profile) as unconfigured
            FROM training_types
        """)
        stats = cur.fetchone()

        # Get list of types with profiles
        cur.execute("""
            SELECT id, name_de, category, subcategory
            FROM training_types
            WHERE profile IS NOT NULL
            ORDER BY name_de
        """)
        configured_types = [r2d(r) for r in cur.fetchall()]

        # Get list of types without profiles
        cur.execute("""
            SELECT id, name_de, category, subcategory
            FROM training_types
            WHERE profile IS NULL
            ORDER BY name_de
        """)
        unconfigured_types = [r2d(r) for r in cur.fetchall()]

    return {
        "total": stats['total'],
        "configured": stats['configured'],
        "unconfigured": stats['unconfigured'],
        "configured_types": configured_types,
        "unconfigured_types": unconfigured_types
    }
398  backend/routers/auth.py  (new file)
@@ -0,0 +1,398 @@
"""
Authentication Endpoints for Mitai Jinkendo

Handles login, logout, password reset, and profile authentication.
"""
import os
import secrets
import smtplib
from typing import Optional
from datetime import datetime, timedelta, timezone
from email.mime.text import MIMEText

from fastapi import APIRouter, HTTPException, Header, Depends
from starlette.requests import Request
from slowapi import Limiter
from slowapi.util import get_remote_address

from db import get_db, get_cursor
from auth import hash_pin, verify_pin, make_token, require_auth
from models import LoginRequest, PasswordResetRequest, PasswordResetConfirm, RegisterRequest

router = APIRouter(prefix="/api/auth", tags=["auth"])
limiter = Limiter(key_func=get_remote_address)
@router.post("/login")
|
||||||
|
@limiter.limit("5/minute")
|
||||||
|
async def login(req: LoginRequest, request: Request):
|
||||||
|
"""Login with email + password."""
|
||||||
|
with get_db() as conn:
|
||||||
|
cur = get_cursor(conn)
|
||||||
|
cur.execute("SELECT * FROM profiles WHERE email=%s", (req.email.lower().strip(),))
|
||||||
|
prof = cur.fetchone()
|
||||||
|
if not prof:
|
||||||
|
raise HTTPException(401, "Ungültige Zugangsdaten")
|
||||||
|
|
||||||
|
# Verify password
|
||||||
|
if not verify_pin(req.password, prof['pin_hash']):
|
||||||
|
raise HTTPException(401, "Ungültige Zugangsdaten")
|
||||||
|
|
||||||
|
# Auto-upgrade from SHA256 to bcrypt
|
||||||
|
if prof['pin_hash'] and not prof['pin_hash'].startswith('$2'):
|
||||||
|
new_hash = hash_pin(req.password)
|
||||||
|
cur.execute("UPDATE profiles SET pin_hash=%s WHERE id=%s", (new_hash, prof['id']))
|
||||||
|
|
||||||
|
# Create session
|
||||||
|
token = make_token()
|
||||||
|
session_days = prof.get('session_days', 30)
|
||||||
|
expires = datetime.now() + timedelta(days=session_days)
|
||||||
|
cur.execute("INSERT INTO sessions (token, profile_id, expires_at, created) VALUES (%s,%s,%s,CURRENT_TIMESTAMP)",
|
||||||
|
(token, prof['id'], expires.isoformat()))
|
||||||
|
|
||||||
|
return {
|
||||||
|
"token": token,
|
||||||
|
"profile_id": prof['id'],
|
||||||
|
"name": prof['name'],
|
||||||
|
"role": prof['role'],
|
||||||
|
"expires_at": expires.isoformat()
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
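The login handler above silently re-hashes legacy SHA256 PINs on a successful login, keyed off the `$2` prefix that all bcrypt variants (`$2a$`, `$2b$`, `$2y$`) share. A minimal standalone sketch of that detection logic (the example hash values are illustrative, not taken from the codebase):

```python
def needs_bcrypt_upgrade(pin_hash: str) -> bool:
    """True if the stored hash is not bcrypt and should be re-hashed on next login."""
    # bcrypt hashes always start with "$2" ($2a$, $2b$, $2y$, ...)
    return bool(pin_hash) and not pin_hash.startswith("$2")

# Legacy hex digest (e.g. SHA256) -> upgrade on next successful login
assert needs_bcrypt_upgrade("5e884898da28047151d0e56f8dc6292773603d0d")
# Modern bcrypt hash -> leave as-is
assert not needs_bcrypt_upgrade("$2b$12$KIXQJ0v1dGxkQeF0yC1uXe")
# Empty/missing hash -> nothing to upgrade
assert not needs_bcrypt_upgrade("")
```

The upgrade can only happen at login time because that is the one moment the plaintext password is available to re-hash.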
@router.post("/logout")
def logout(x_auth_token: Optional[str] = Header(default=None)):
    """Logout (delete session)."""
    if x_auth_token:
        with get_db() as conn:
            cur = get_cursor(conn)
            cur.execute("DELETE FROM sessions WHERE token=%s", (x_auth_token,))
    return {"ok": True}


@router.get("/me")
def get_me(session: dict = Depends(require_auth)):
    """Get current user info."""
    pid = session['profile_id']
    # Import here to avoid circular dependency
    from routers.profiles import get_profile
    return get_profile(pid, session)


@router.get("/status")
def auth_status():
    """Health check endpoint."""
    return {"status": "ok", "service": "mitai-jinkendo", "version": "v9b"}


@router.put("/pin")
def change_pin(req: dict, session: dict = Depends(require_auth)):
    """Change PIN/password for current user."""
    pid = session['profile_id']
    new_pin = req.get('pin', '')
    if len(new_pin) < 4:
        raise HTTPException(400, "PIN/Passwort muss mind. 4 Zeichen haben")

    new_hash = hash_pin(new_pin)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("UPDATE profiles SET pin_hash=%s WHERE id=%s", (new_hash, pid))

    return {"ok": True}


@router.post("/forgot-password")
@limiter.limit("3/minute")
async def password_reset_request(req: PasswordResetRequest, request: Request):
    """Request password reset email."""
    email = req.email.lower().strip()
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("SELECT id, name FROM profiles WHERE email=%s", (email,))
        prof = cur.fetchone()
        if not prof:
            # Don't reveal if email exists
            return {"ok": True, "message": "Falls die E-Mail existiert, wurde ein Reset-Link gesendet."}

        # Generate reset token
        token = secrets.token_urlsafe(32)
        expires = datetime.now() + timedelta(hours=1)

        # Store in sessions table (reuse mechanism)
        cur.execute("INSERT INTO sessions (token, profile_id, expires_at, created) VALUES (%s,%s,%s,CURRENT_TIMESTAMP)",
                    (f"reset_{token}", prof['id'], expires.isoformat()))

        # Send email
        try:
            smtp_host = os.getenv("SMTP_HOST")
            smtp_port = int(os.getenv("SMTP_PORT", 587))
            smtp_user = os.getenv("SMTP_USER")
            smtp_pass = os.getenv("SMTP_PASS")
            smtp_from = os.getenv("SMTP_FROM")
            app_url = os.getenv("APP_URL", "https://mitai.jinkendo.de")

            if smtp_host and smtp_user and smtp_pass:
                msg = MIMEText(f"""Hallo {prof['name']},

Du hast einen Passwort-Reset angefordert.

Reset-Link: {app_url}/reset-password?token={token}

Der Link ist 1 Stunde gültig.

Falls du diese Anfrage nicht gestellt hast, ignoriere diese E-Mail.

Dein Mitai Jinkendo Team
""")
                msg['Subject'] = "Passwort zurücksetzen – Mitai Jinkendo"
                msg['From'] = smtp_from
                msg['To'] = email

                with smtplib.SMTP(smtp_host, smtp_port) as server:
                    server.starttls()
                    server.login(smtp_user, smtp_pass)
                    server.send_message(msg)
        except Exception as e:
            print(f"Email error: {e}")

    return {"ok": True, "message": "Falls die E-Mail existiert, wurde ein Reset-Link gesendet."}


@router.post("/reset-password")
def password_reset_confirm(req: PasswordResetConfirm):
    """Confirm password reset with token."""
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("SELECT profile_id FROM sessions WHERE token=%s AND expires_at > CURRENT_TIMESTAMP",
                    (f"reset_{req.token}",))
        sess = cur.fetchone()
        if not sess:
            raise HTTPException(400, "Ungültiger oder abgelaufener Reset-Link")

        pid = sess['profile_id']
        new_hash = hash_pin(req.new_password)
        cur.execute("UPDATE profiles SET pin_hash=%s WHERE id=%s", (new_hash, pid))
        cur.execute("DELETE FROM sessions WHERE token=%s", (f"reset_{req.token}",))

    return {"ok": True, "message": "Passwort erfolgreich zurückgesetzt"}


# ── Helper: Send Email ────────────────────────────────────────────────────────
def send_email(to_email: str, subject: str, body: str):
    """Send email via SMTP (reusable helper)."""
    try:
        smtp_host = os.getenv("SMTP_HOST")
        smtp_port = int(os.getenv("SMTP_PORT", 587))
        smtp_user = os.getenv("SMTP_USER")
        smtp_pass = os.getenv("SMTP_PASS")
        smtp_from = os.getenv("SMTP_FROM", "noreply@jinkendo.de")

        if not smtp_host or not smtp_user or not smtp_pass:
            print("SMTP not configured, skipping email")
            return False

        msg = MIMEText(body)
        msg['Subject'] = subject
        msg['From'] = smtp_from
        msg['To'] = to_email

        with smtplib.SMTP(smtp_host, smtp_port) as server:
            server.starttls()
            server.login(smtp_user, smtp_pass)
            server.send_message(msg)

        return True
    except Exception as e:
        print(f"Email error: {e}")
        return False


# ── Registration Endpoints ────────────────────────────────────────────────────
@router.post("/register")
@limiter.limit("3/hour")
async def register(req: RegisterRequest, request: Request):
    """Self-registration with email verification."""
    email = req.email.lower().strip()
    name = req.name.strip()
    password = req.password

    # Validation
    if not email or '@' not in email:
        raise HTTPException(400, "Ungültige E-Mail-Adresse")
    if len(password) < 8:
        raise HTTPException(400, "Passwort muss mindestens 8 Zeichen lang sein")
    if not name or len(name) < 2:
        raise HTTPException(400, "Name muss mindestens 2 Zeichen lang sein")

    with get_db() as conn:
        cur = get_cursor(conn)

        # Check if email already exists
        cur.execute("SELECT id FROM profiles WHERE email=%s", (email,))
        if cur.fetchone():
            raise HTTPException(400, "E-Mail-Adresse bereits registriert")

        # Generate verification token
        verification_token = secrets.token_urlsafe(32)
        verification_expires = datetime.now(timezone.utc) + timedelta(hours=24)

        # Create profile (inactive until verified)
        profile_id = str(secrets.token_hex(16))
        pin_hash = hash_pin(password)
        trial_ends = datetime.now(timezone.utc) + timedelta(days=14)  # 14-day trial

        cur.execute("""
            INSERT INTO profiles (
                id, name, email, pin_hash, auth_type, role, tier,
                email_verified, verification_token, verification_expires,
                trial_ends_at, created
            ) VALUES (%s, %s, %s, %s, 'email', 'user', 'free', FALSE, %s, %s, %s, CURRENT_TIMESTAMP)
        """, (profile_id, name, email, pin_hash, verification_token, verification_expires, trial_ends))

        # Send verification email
        app_url = os.getenv("APP_URL", "https://mitai.jinkendo.de")
        verify_url = f"{app_url}/verify?token={verification_token}"

        email_body = f"""Hallo {name},

willkommen bei Mitai Jinkendo!

Bitte bestätige deine E-Mail-Adresse um die Registrierung abzuschließen:

{verify_url}

Der Link ist 24 Stunden gültig.

Dein Mitai Jinkendo Team
"""

        send_email(email, "Willkommen bei Mitai Jinkendo – E-Mail bestätigen", email_body)

        return {
            "ok": True,
            "message": "Registrierung erfolgreich! Bitte prüfe dein E-Mail-Postfach und bestätige deine E-Mail-Adresse."
        }


@router.get("/verify/{token}")
async def verify_email(token: str):
    """Verify email address and activate account."""
    with get_db() as conn:
        cur = get_cursor(conn)

        # Find profile with this verification token
        cur.execute("""
            SELECT id, name, email, email_verified, verification_expires
            FROM profiles
            WHERE verification_token=%s
        """, (token,))

        prof = cur.fetchone()

        if not prof:
            # Token not found - might be already used/verified
            # Check if there's a verified profile (token was deleted after verification)
            raise HTTPException(400, "Verifikations-Link ungültig oder bereits verwendet. Falls du bereits verifiziert bist, melde dich einfach an.")

        if prof['email_verified']:
            raise HTTPException(400, "E-Mail-Adresse bereits bestätigt")

        # Check if token expired
        if prof['verification_expires'] and datetime.now(timezone.utc) > prof['verification_expires']:
            raise HTTPException(400, "Verifikations-Link abgelaufen. Bitte registriere dich erneut.")

        # Mark as verified and clear token
        cur.execute("""
            UPDATE profiles
            SET email_verified=TRUE, verification_token=NULL, verification_expires=NULL
            WHERE id=%s
        """, (prof['id'],))

        # Create session (auto-login after verification)
        session_token = make_token()
        expires = datetime.now(timezone.utc) + timedelta(days=30)
        cur.execute("""
            INSERT INTO sessions (token, profile_id, expires_at, created)
            VALUES (%s, %s, %s, CURRENT_TIMESTAMP)
        """, (session_token, prof['id'], expires))

        return {
            "ok": True,
            "message": "E-Mail-Adresse erfolgreich bestätigt!",
            "token": session_token,
            "profile": {
                "id": prof['id'],
                "name": prof['name'],
                "email": prof['email']
            }
        }


@router.post("/resend-verification")
@limiter.limit("3/hour")
async def resend_verification(req: dict, request: Request):
    """Resend verification email for unverified account."""
    email = req.get('email', '').strip().lower()

    if not email:
        raise HTTPException(400, "E-Mail-Adresse erforderlich")

    with get_db() as conn:
        cur = get_cursor(conn)

        # Find profile by email
        cur.execute("""
            SELECT id, name, email, email_verified, verification_token, verification_expires
            FROM profiles
            WHERE email=%s
        """, (email,))

        prof = cur.fetchone()

        if not prof:
            # Don't leak info about existing emails
            return {"ok": True, "message": "Falls ein Account mit dieser E-Mail existiert, wurde eine Bestätigungs-E-Mail versendet."}

        if prof['email_verified']:
            raise HTTPException(400, "E-Mail-Adresse bereits bestätigt")

        # Generate new verification token
        verification_token = secrets.token_urlsafe(32)
        verification_expires = datetime.now(timezone.utc) + timedelta(hours=24)

        cur.execute("""
            UPDATE profiles
            SET verification_token=%s, verification_expires=%s
            WHERE id=%s
        """, (verification_token, verification_expires, prof['id']))

        # Send verification email
        app_url = os.getenv("APP_URL", "https://mitai.jinkendo.de")
        verify_url = f"{app_url}/verify?token={verification_token}"

        email_body = f"""Hallo {prof['name']},

du hast eine neue Bestätigungs-E-Mail angefordert.

Bitte bestätige deine E-Mail-Adresse, indem du auf folgenden Link klickst:

{verify_url}

Dieser Link ist 24 Stunden gültig.

Falls du diese E-Mail nicht angefordert hast, kannst du sie einfach ignorieren.

Viele Grüße
Dein Mitai Jinkendo Team
"""

        try:
            send_email(
                to_email=email,
                subject="Neue Bestätigungs-E-Mail - Mitai Jinkendo",
                body=email_body
            )
        except Exception as e:
            print(f"Failed to send verification email: {e}")
            raise HTTPException(500, "E-Mail konnte nicht versendet werden")

    return {"ok": True, "message": "Bestätigungs-E-Mail wurde erneut versendet."}
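`forgot-password` and `reset-password` above reuse the sessions table for reset tokens by prefixing them with `reset_`, so a reset token can never collide with (or be replayed as) a login session token. A standalone sketch of that namespacing pattern, using an in-memory dict as a stand-in for the sessions table (all names here are illustrative):

```python
import secrets
from datetime import datetime, timedelta

sessions = {}  # token -> (profile_id, expires_at); stand-in for the DB table

def issue_reset_token(profile_id: str) -> str:
    token = secrets.token_urlsafe(32)
    # The "reset_" prefix keeps reset tokens out of the login-token namespace
    sessions[f"reset_{token}"] = (profile_id, datetime.now() + timedelta(hours=1))
    return token

def redeem_reset_token(token: str):
    entry = sessions.pop(f"reset_{token}", None)  # single-use: removed on redeem
    if entry is None or entry[1] < datetime.now():
        return None
    return entry[0]  # profile_id

tok = issue_reset_token("profile-42")
assert tok not in sessions                # the bare token is never a valid session key
assert redeem_reset_token(tok) == "profile-42"
assert redeem_reset_token(tok) is None    # second redeem fails
```

The production code differs in one detail: it deletes the row only after the new password is written, inside the same transaction.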
412
backend/routers/blood_pressure.py
Normal file

@@ -0,0 +1,412 @@
"""
Blood Pressure Router - v9d Phase 2d Refactored

Context-dependent blood pressure measurements (multiple times per day):
- Systolic/Diastolic Blood Pressure
- Pulse during measurement
- Context tagging (morning_fasted, after_meal, before_training, etc.)
- Warning flags (irregular heartbeat, AFib)

Endpoints:
- GET    /api/blood-pressure                  List BP measurements
- GET    /api/blood-pressure/by-date/{date}   Get measurements for specific date
- POST   /api/blood-pressure                  Create BP measurement
- PUT    /api/blood-pressure/{id}             Update BP measurement
- DELETE /api/blood-pressure/{id}             Delete BP measurement
- GET    /api/blood-pressure/stats            Statistics and trends
- POST   /api/blood-pressure/import/omron     Import Omron CSV
"""
from fastapi import APIRouter, HTTPException, Depends, Header, UploadFile, File
from pydantic import BaseModel
from typing import Optional
from datetime import datetime, timedelta
import logging
import csv
import io

from db import get_db, get_cursor, r2d
from auth import require_auth
from routers.profiles import get_pid

router = APIRouter(prefix="/api/blood-pressure", tags=["blood_pressure"])
logger = logging.getLogger(__name__)


# German month mapping for Omron dates
GERMAN_MONTHS = {
    'Januar': '01', 'Jan.': '01', 'Jan': '01',
    'Februar': '02', 'Feb.': '02', 'Feb': '02',
    'März': '03', 'Mär.': '03', 'Mär': '03',
    'April': '04', 'Apr.': '04', 'Apr': '04',
    'Mai': '05',
    'Juni': '06', 'Jun.': '06', 'Jun': '06',
    'Juli': '07', 'Jul.': '07', 'Jul': '07',
    'August': '08', 'Aug.': '08', 'Aug': '08',
    'September': '09', 'Sep.': '09', 'Sep': '09',
    'Oktober': '10', 'Okt.': '10', 'Okt': '10',
    'November': '11', 'Nov.': '11', 'Nov': '11',
    'Dezember': '12', 'Dez.': '12', 'Dez': '12',
}


# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
# Pydantic Models
# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

class BPEntry(BaseModel):
    measured_at: str  # ISO format datetime
    systolic: int
    diastolic: int
    pulse: Optional[int] = None
    context: Optional[str] = None  # morning_fasted, after_meal, etc.
    irregular_heartbeat: Optional[bool] = False
    possible_afib: Optional[bool] = False
    note: Optional[str] = None


# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
# Helper Functions
# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

def parse_omron_date(date_str: str, time_str: str) -> Optional[str]:
    """
    Parse Omron German date/time format to ISO datetime.
    Input: "13 März 2026", "08:30"
    Output: "2026-03-13 08:30:00" (None on parse failure)
    """
    try:
        parts = date_str.strip().split()
        if len(parts) != 3:
            return None

        day = parts[0]
        month_name = parts[1]
        year = parts[2]

        month = GERMAN_MONTHS.get(month_name)
        if not month:
            return None

        iso_date = f"{year}-{month}-{day.zfill(2)}"
        iso_datetime = f"{iso_date} {time_str}:00"

        # Validate
        datetime.fromisoformat(iso_datetime)
        return iso_datetime

    except Exception as e:
        logger.error(f"Error parsing Omron date: {date_str} {time_str} - {e}")
        return None

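A condensed, standalone version of the date parsing above, showing the expected round-trip for a German Omron export row (only a subset of `GERMAN_MONTHS` is reproduced here for illustration):

```python
from datetime import datetime

GERMAN_MONTHS = {'Januar': '01', 'März': '03', 'Dez.': '12'}  # illustrative subset

def parse_omron_date(date_str: str, time_str: str):
    """'13 März 2026', '08:30' -> '2026-03-13 08:30:00' (None on failure)."""
    parts = date_str.strip().split()
    if len(parts) != 3:
        return None
    day, month_name, year = parts
    month = GERMAN_MONTHS.get(month_name)
    if not month:
        return None
    iso_datetime = f"{year}-{month}-{day.zfill(2)} {time_str}:00"
    datetime.fromisoformat(iso_datetime)  # raises on nonsense dates
    return iso_datetime

assert parse_omron_date("13 März 2026", "08:30") == "2026-03-13 08:30:00"
assert parse_omron_date("3 Dez. 2025", "19:05") == "2025-12-03 19:05:00"
assert parse_omron_date("13 Foo 2026", "08:30") is None
```

Note the `day.zfill(2)` left-pad: Omron exports single-digit days without a leading zero, which `datetime.fromisoformat` would otherwise reject.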
# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
# CRUD Endpoints
# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

@router.get("")
def list_bp_measurements(
    limit: int = 90,
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """Get blood pressure measurements (last N entries)."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""
            SELECT * FROM blood_pressure_log
            WHERE profile_id = %s
            ORDER BY measured_at DESC
            LIMIT %s
        """, (pid, limit))
        return [r2d(r) for r in cur.fetchall()]


@router.get("/by-date/{date}")
def get_bp_by_date(
    date: str,
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """Get all BP measurements for a specific date."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""
            SELECT * FROM blood_pressure_log
            WHERE profile_id = %s
              AND DATE(measured_at) = %s
            ORDER BY measured_at ASC
        """, (pid, date))
        return [r2d(r) for r in cur.fetchall()]


@router.post("")
def create_bp_measurement(
    entry: BPEntry,
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """Create new BP measurement."""
    pid = get_pid(x_profile_id)

    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""
            INSERT INTO blood_pressure_log (
                profile_id, measured_at,
                systolic, diastolic, pulse,
                context, irregular_heartbeat, possible_afib,
                note, source
            ) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, 'manual')
            RETURNING *
        """, (
            pid, entry.measured_at,
            entry.systolic, entry.diastolic, entry.pulse,
            entry.context, entry.irregular_heartbeat, entry.possible_afib,
            entry.note
        ))
        return r2d(cur.fetchone())


@router.put("/{entry_id}")
def update_bp_measurement(
    entry_id: int,
    entry: BPEntry,
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """Update existing BP measurement."""
    pid = get_pid(x_profile_id)

    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""
            UPDATE blood_pressure_log
            SET measured_at = %s,
                systolic = %s,
                diastolic = %s,
                pulse = %s,
                context = %s,
                irregular_heartbeat = %s,
                possible_afib = %s,
                note = %s
            WHERE id = %s AND profile_id = %s
            RETURNING *
        """, (
            entry.measured_at,
            entry.systolic, entry.diastolic, entry.pulse,
            entry.context, entry.irregular_heartbeat, entry.possible_afib,
            entry.note,
            entry_id, pid
        ))
        row = cur.fetchone()
        if not row:
            raise HTTPException(404, "Entry not found")
        return r2d(row)


@router.delete("/{entry_id}")
def delete_bp_measurement(
    entry_id: int,
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """Delete BP measurement."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""
            DELETE FROM blood_pressure_log
            WHERE id = %s AND profile_id = %s
        """, (entry_id, pid))
        if cur.rowcount == 0:
            raise HTTPException(404, "Entry not found")
        return {"ok": True}


# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
# Statistics & Trends
# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

@router.get("/stats")
def get_bp_stats(
    days: int = 30,
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """Get blood pressure statistics and trends."""
    pid = get_pid(x_profile_id)
    cutoff_date = datetime.now() - timedelta(days=days)

    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""
            SELECT
                COUNT(*) as total_measurements,
                -- Overall averages
                AVG(systolic) as avg_systolic,
                AVG(diastolic) as avg_diastolic,
                AVG(pulse) FILTER (WHERE pulse IS NOT NULL) as avg_pulse,
                -- 7-day averages
                AVG(systolic) FILTER (WHERE measured_at >= NOW() - INTERVAL '7 days') as avg_systolic_7d,
                AVG(diastolic) FILTER (WHERE measured_at >= NOW() - INTERVAL '7 days') as avg_diastolic_7d,
                -- Context-specific averages
                AVG(systolic) FILTER (WHERE context = 'morning_fasted') as avg_systolic_morning,
                AVG(diastolic) FILTER (WHERE context = 'morning_fasted') as avg_diastolic_morning,
                AVG(systolic) FILTER (WHERE context = 'evening') as avg_systolic_evening,
                AVG(diastolic) FILTER (WHERE context = 'evening') as avg_diastolic_evening,
                -- Warning flags
                COUNT(*) FILTER (WHERE irregular_heartbeat = true) as irregular_count,
                COUNT(*) FILTER (WHERE possible_afib = true) as afib_count
            FROM blood_pressure_log
            WHERE profile_id = %s AND measured_at >= %s
        """, (pid, cutoff_date))

        stats = r2d(cur.fetchone())

        # Classify BP ranges (WHO/ISH guidelines)
        if stats['avg_systolic'] and stats['avg_diastolic']:
            if stats['avg_systolic'] < 120 and stats['avg_diastolic'] < 80:
                stats['bp_category'] = 'optimal'
            elif stats['avg_systolic'] < 130 and stats['avg_diastolic'] < 85:
                stats['bp_category'] = 'normal'
            elif stats['avg_systolic'] < 140 and stats['avg_diastolic'] < 90:
                stats['bp_category'] = 'high_normal'
            elif stats['avg_systolic'] < 160 and stats['avg_diastolic'] < 100:
                stats['bp_category'] = 'grade_1_hypertension'
            elif stats['avg_systolic'] < 180 and stats['avg_diastolic'] < 110:
                stats['bp_category'] = 'grade_2_hypertension'
            else:
                stats['bp_category'] = 'grade_3_hypertension'
        else:
            stats['bp_category'] = None

        return stats

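The category thresholds used in `get_bp_stats` (WHO/ISH bands from optimal below 120/80 up to grade 3 at or above 180/110) can be pulled out into a pure function, which makes the band boundaries easy to unit-test in isolation. A sketch mirroring that classification chain (the function name is illustrative, not from the codebase):

```python
def classify_bp(systolic: float, diastolic: float) -> str:
    """Map average BP to a WHO/ISH category; both values must be under a band's limits."""
    if systolic < 120 and diastolic < 80:
        return 'optimal'
    if systolic < 130 and diastolic < 85:
        return 'normal'
    if systolic < 140 and diastolic < 90:
        return 'high_normal'
    if systolic < 160 and diastolic < 100:
        return 'grade_1_hypertension'
    if systolic < 180 and diastolic < 110:
        return 'grade_2_hypertension'
    return 'grade_3_hypertension'

assert classify_bp(118, 76) == 'optimal'
assert classify_bp(138, 88) == 'high_normal'
assert classify_bp(150, 95) == 'grade_1_hypertension'
# One value out of band is enough to push into the higher category:
assert classify_bp(185, 95) == 'grade_3_hypertension'
```

Because each branch requires *both* values below its limits, a single elevated reading (systolic or diastolic) is classified into the higher grade, matching how the guideline tables are read.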

# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
# Import: Omron CSV
# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

@router.post("/import/omron")
async def import_omron_csv(
    file: UploadFile = File(...),
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """Import blood pressure measurements from Omron CSV export."""
    pid = get_pid(x_profile_id)

    content = await file.read()
    decoded = content.decode('utf-8')
    reader = csv.DictReader(io.StringIO(decoded))

    inserted = 0
    updated = 0
    skipped = 0
    errors = 0

    with get_db() as conn:
        cur = get_cursor(conn)

        # Log available columns for debugging
        first_row = True

        for row in reader:
            try:
                if first_row:
                    logger.info(f"Omron CSV Columns: {list(row.keys())}")
                    first_row = False

                # Parse Omron German date format
                date_str = row.get('Datum', row.get('Date'))
                time_str = row.get('Zeit', row.get('Time', '08:00'))

                if not date_str:
                    skipped += 1
                    continue

                measured_at = parse_omron_date(date_str, time_str)
                if not measured_at:
                    errors += 1
                    continue

                # Extract measurements (support column names with/without units)
                systolic = (row.get('Systolisch (mmHg)') or row.get('Systolisch') or
                            row.get('Systolic (mmHg)') or row.get('Systolic'))
                diastolic = (row.get('Diastolisch (mmHg)') or row.get('Diastolisch') or
                             row.get('Diastolic (mmHg)') or row.get('Diastolic'))
                pulse = (row.get('Puls (bpm)') or row.get('Puls') or
                         row.get('Pulse (bpm)') or row.get('Pulse'))

                if not systolic or not diastolic:
                    logger.warning(f"Skipped row {date_str} {time_str}: Missing BP values (sys={systolic}, dia={diastolic})")
                    skipped += 1
                    continue

                # Parse warning flags (support various column names)
                irregular = (row.get('Unregelmäßiger Herzschlag festgestellt') or
                             row.get('Unregelmäßiger Herzschlag') or
                             row.get('Irregular Heartbeat') or '')
                afib = (row.get('Mögliches AFib') or
                        row.get('Vorhofflimmern') or
                        row.get('Possible AFib') or
                        row.get('AFib') or '')

                irregular_heartbeat = irregular.lower() in ['ja', 'yes', 'true', '1']
                possible_afib = afib.lower() in ['ja', 'yes', 'true', '1']

                # Determine context based on time
                hour = int(time_str.split(':')[0])
                if 5 <= hour < 10:
                    context = 'morning_fasted'
                elif 18 <= hour < 23:
                    context = 'evening'
                else:
                    context = 'other'

                # Upsert
                cur.execute("""
                    INSERT INTO blood_pressure_log (
                        profile_id, measured_at,
                        systolic, diastolic, pulse,
                        context, irregular_heartbeat, possible_afib,
                        source
                    ) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, 'omron')
                    ON CONFLICT (profile_id, measured_at)
                    DO UPDATE SET
                        systolic = EXCLUDED.systolic,
                        diastolic = EXCLUDED.diastolic,
                        pulse = EXCLUDED.pulse,
                        context = EXCLUDED.context,
                        irregular_heartbeat = EXCLUDED.irregular_heartbeat,
                        possible_afib = EXCLUDED.possible_afib
                    WHERE blood_pressure_log.source != 'manual'
                    RETURNING (xmax = 0) AS inserted
                """, (
                    pid, measured_at,
                    int(systolic), int(diastolic),
                    int(pulse) if pulse else None,
                    context, irregular_heartbeat, possible_afib
                ))

                result = cur.fetchone()
                if result is None:
                    # WHERE clause prevented update (manual entry exists)
                    skipped += 1
                elif result['inserted']:
                    inserted += 1
                else:
                    updated += 1

            except Exception as e:
                logger.error(f"Error importing Omron row: {e}")
                errors += 1

    return {
        "inserted": inserted,
        "updated": updated,
        "skipped": skipped,
        "errors": errors
    }
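The import above derives a measurement context from the hour of day (05:00 to 09:59 becomes `morning_fasted`, 18:00 to 22:59 becomes `evening`, everything else `other`). The same bucketing as a pure function, with the boundary cases spelled out:

```python
def context_for_hour(hour: int) -> str:
    """Bucket a measurement hour into the contexts used by the Omron import."""
    if 5 <= hour < 10:
        return 'morning_fasted'
    if 18 <= hour < 23:
        return 'evening'
    return 'other'

assert context_for_hour(7) == 'morning_fasted'
assert context_for_hour(10) == 'other'   # 10:00 is already outside the morning window
assert context_for_hour(20) == 'evening'
assert context_for_hour(23) == 'other'   # late-night readings fall through to 'other'
```

Keeping this as a separate function would also let manual entries and imported entries share one definition of the context windows.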
102
backend/routers/caliper.py
Normal file

@@ -0,0 +1,102 @@
"""
Caliper/Skinfold Tracking Endpoints for Mitai Jinkendo

Handles body fat measurements via skinfold caliper (4 methods supported).
"""
import uuid
import logging
from typing import Optional

from fastapi import APIRouter, Header, Depends, HTTPException

from db import get_db, get_cursor, r2d
from auth import require_auth, check_feature_access, increment_feature_usage
from models import CaliperEntry
from routers.profiles import get_pid
from feature_logger import log_feature_usage

router = APIRouter(prefix="/api/caliper", tags=["caliper"])
logger = logging.getLogger(__name__)


@router.get("")
def list_caliper(limit: int = 100, x_profile_id: Optional[str] = Header(default=None), session: dict = Depends(require_auth)):
    """Get caliper entries for current profile."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute(
            "SELECT * FROM caliper_log WHERE profile_id=%s ORDER BY date DESC LIMIT %s", (pid, limit))
        return [r2d(r) for r in cur.fetchall()]


@router.post("")
def upsert_caliper(e: CaliperEntry, x_profile_id: Optional[str] = Header(default=None), session: dict = Depends(require_auth)):
    """Create or update caliper entry (upsert by date)."""
    pid = get_pid(x_profile_id)

    # Phase 4: Check feature access and ENFORCE
    access = check_feature_access(pid, 'caliper_entries')
    log_feature_usage(pid, 'caliper_entries', access, 'create')

    if not access['allowed']:
        logger.warning(
            f"[FEATURE-LIMIT] User {pid} blocked: "
            f"caliper_entries {access['reason']} (used: {access['used']}, limit: {access['limit']})"
        )
        raise HTTPException(
            status_code=403,
            detail=f"Limit erreicht: Du hast das Kontingent für Caliper-Einträge überschritten ({access['used']}/{access['limit']}). "
                   f"Bitte kontaktiere den Admin oder warte bis zum nächsten Reset."
        )

    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("SELECT id FROM caliper_log WHERE profile_id=%s AND date=%s", (pid, e.date))
        ex = cur.fetchone()
        d = e.model_dump()
        is_new_entry = not ex

        if ex:
            # UPDATE existing entry
            eid = ex['id']
            sets = ', '.join(f"{k}=%s" for k in d if k != 'date')
            cur.execute(f"UPDATE caliper_log SET {sets} WHERE id=%s",
                        [v for k, v in d.items() if k != 'date'] + [eid])
        else:
            # INSERT new entry
            eid = str(uuid.uuid4())
            cur.execute("""INSERT INTO caliper_log
                (id,profile_id,date,sf_method,sf_chest,sf_axilla,sf_triceps,sf_subscap,sf_suprailiac,
|
||||||
|
sf_abdomen,sf_thigh,sf_calf_med,sf_lowerback,sf_biceps,body_fat_pct,lean_mass,fat_mass,notes,created)
|
||||||
|
VALUES (%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,CURRENT_TIMESTAMP)""",
|
||||||
|
(eid,pid,d['date'],d['sf_method'],d['sf_chest'],d['sf_axilla'],d['sf_triceps'],
|
||||||
|
d['sf_subscap'],d['sf_suprailiac'],d['sf_abdomen'],d['sf_thigh'],d['sf_calf_med'],
|
||||||
|
d['sf_lowerback'],d['sf_biceps'],d['body_fat_pct'],d['lean_mass'],d['fat_mass'],d['notes']))
|
||||||
|
|
||||||
|
# Phase 2: Increment usage counter (only for new entries)
|
||||||
|
increment_feature_usage(pid, 'caliper_entries')
|
||||||
|
|
||||||
|
return {"id":eid,"date":e.date}
|
||||||
|
|
||||||
|
|
||||||
|
@router.put("/{eid}")
|
||||||
|
def update_caliper(eid: str, e: CaliperEntry, x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
|
||||||
|
"""Update existing caliper entry."""
|
||||||
|
pid = get_pid(x_profile_id)
|
||||||
|
with get_db() as conn:
|
||||||
|
d = e.model_dump()
|
||||||
|
cur = get_cursor(conn)
|
||||||
|
cur.execute(f"UPDATE caliper_log SET {', '.join(f'{k}=%s' for k in d)} WHERE id=%s AND profile_id=%s",
|
||||||
|
list(d.values())+[eid,pid])
|
||||||
|
return {"id":eid}
|
||||||
|
|
||||||
|
|
||||||
|
@router.delete("/{eid}")
|
||||||
|
def delete_caliper(eid: str, x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
|
||||||
|
"""Delete caliper entry."""
|
||||||
|
pid = get_pid(x_profile_id)
|
||||||
|
with get_db() as conn:
|
||||||
|
cur = get_cursor(conn)
|
||||||
|
cur.execute("DELETE FROM caliper_log WHERE id=%s AND profile_id=%s", (eid,pid))
|
||||||
|
return {"ok":True}
|
||||||
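Reviewer note: `upsert_caliper` (and its circumference twin below) builds the UPDATE's SET clause dynamically from the Pydantic model dump, so only column names from the model ever reach the SQL string while all values stay behind `%s` placeholders. A small standalone sketch of that pattern (the `build_update` helper is hypothetical, for illustration only):

```python
def build_update(table, data, entry_id, immutable=("date",)):
    """Build a parameterized UPDATE from a model dump, skipping immutable keys.

    Column names come from the model's own field names (trusted input);
    values are passed separately as driver parameters, never interpolated.
    """
    cols = [k for k in data if k not in immutable]
    sql = f"UPDATE {table} SET {', '.join(f'{c}=%s' for c in cols)} WHERE id=%s"
    params = [data[c] for c in cols] + [entry_id]
    return sql, params

sql, params = build_update(
    "caliper_log",
    {"date": "2026-01-01", "sf_chest": 10, "notes": "x"},
    "abc",
)
# sql    == "UPDATE caliper_log SET sf_chest=%s, notes=%s WHERE id=%s"
# params == [10, "x", "abc"]
```

This is safe precisely because the keys are model field names; it would not be safe with arbitrary client-supplied dicts.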
backend/routers/circumference.py (Normal file, 100 lines)
@@ -0,0 +1,100 @@
"""
Circumference Tracking Endpoints for Mitai Jinkendo

Handles body circumference measurements (8 measurement points).
"""
import uuid
import logging
from typing import Optional

from fastapi import APIRouter, Header, Depends, HTTPException

from db import get_db, get_cursor, r2d
from auth import require_auth, check_feature_access, increment_feature_usage
from models import CircumferenceEntry
from routers.profiles import get_pid
from feature_logger import log_feature_usage

router = APIRouter(prefix="/api/circumferences", tags=["circumference"])
logger = logging.getLogger(__name__)


@router.get("")
def list_circs(limit: int = 100, x_profile_id: Optional[str] = Header(default=None), session: dict = Depends(require_auth)):
    """Get circumference entries for current profile."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute(
            "SELECT * FROM circumference_log WHERE profile_id=%s ORDER BY date DESC LIMIT %s", (pid, limit))
        return [r2d(r) for r in cur.fetchall()]


@router.post("")
def upsert_circ(e: CircumferenceEntry, x_profile_id: Optional[str] = Header(default=None), session: dict = Depends(require_auth)):
    """Create or update circumference entry (upsert by date)."""
    pid = get_pid(x_profile_id)

    # Phase 4: Check feature access and ENFORCE
    access = check_feature_access(pid, 'circumference_entries')
    log_feature_usage(pid, 'circumference_entries', access, 'create')

    if not access['allowed']:
        logger.warning(
            f"[FEATURE-LIMIT] User {pid} blocked: "
            f"circumference_entries {access['reason']} (used: {access['used']}, limit: {access['limit']})"
        )
        raise HTTPException(
            status_code=403,
            detail=f"Limit erreicht: Du hast das Kontingent für Umfangs-Einträge überschritten ({access['used']}/{access['limit']}). "
                   f"Bitte kontaktiere den Admin oder warte bis zum nächsten Reset."
        )

    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("SELECT id FROM circumference_log WHERE profile_id=%s AND date=%s", (pid, e.date))
        ex = cur.fetchone()
        d = e.model_dump()
        is_new_entry = not ex

        if ex:
            # UPDATE existing entry
            eid = ex['id']
            sets = ', '.join(f"{k}=%s" for k in d if k != 'date')
            cur.execute(f"UPDATE circumference_log SET {sets} WHERE id=%s",
                        [v for k, v in d.items() if k != 'date'] + [eid])
        else:
            # INSERT new entry
            eid = str(uuid.uuid4())
            cur.execute("""INSERT INTO circumference_log
                (id,profile_id,date,c_neck,c_chest,c_waist,c_belly,c_hip,c_thigh,c_calf,c_arm,notes,photo_id,created)
                VALUES (%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,CURRENT_TIMESTAMP)""",
                (eid, pid, d['date'], d['c_neck'], d['c_chest'], d['c_waist'], d['c_belly'],
                 d['c_hip'], d['c_thigh'], d['c_calf'], d['c_arm'], d['notes'], d['photo_id']))

            # Phase 2: Increment usage counter (only for new entries)
            increment_feature_usage(pid, 'circumference_entries')

    return {"id": eid, "date": e.date}


@router.put("/{eid}")
def update_circ(eid: str, e: CircumferenceEntry, x_profile_id: Optional[str] = Header(default=None), session: dict = Depends(require_auth)):
    """Update existing circumference entry."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        d = e.model_dump()
        cur = get_cursor(conn)
        cur.execute(f"UPDATE circumference_log SET {', '.join(f'{k}=%s' for k in d)} WHERE id=%s AND profile_id=%s",
                    list(d.values()) + [eid, pid])
    return {"id": eid}


@router.delete("/{eid}")
def delete_circ(eid: str, x_profile_id: Optional[str] = Header(default=None), session: dict = Depends(require_auth)):
    """Delete circumference entry."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("DELETE FROM circumference_log WHERE id=%s AND profile_id=%s", (eid, pid))
    return {"ok": True}
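Reviewer note: every write endpoint in these routers repeats the same enforce-then-403 block around `check_feature_access()`. The contract that block assumes (an `access` dict with `allowed`, `reason`, `used`, `limit` keys, inferred from what the routers read, not from a documented API) can be sketched as one reusable function:

```python
def feature_gate(access: dict):
    """Return None when allowed, else the 403 detail string.

    The access dict shape (allowed/reason/used/limit) is an assumption
    inferred from the keys the routers read off check_feature_access().
    """
    if access["allowed"]:
        return None
    return (f"Limit erreicht: {access['used']}/{access['limit']} "
            f"({access['reason']})")

blocked = feature_gate({"allowed": False, "reason": "monthly quota", "used": 5, "limit": 5})
allowed = feature_gate({"allowed": True, "reason": "", "used": 1, "limit": 5})
```

Factoring this into a shared helper (or a FastAPI dependency) would remove four near-identical copies across caliper, circumference, and export routes.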
backend/routers/coupons.py (Normal file, 282 lines)
@@ -0,0 +1,282 @@
"""
Coupon Management Endpoints for Mitai Jinkendo

Handles coupon CRUD (admin) and redemption (users).
"""
from datetime import datetime, timedelta
from typing import Optional
from fastapi import APIRouter, HTTPException, Depends

from db import get_db, get_cursor, r2d
from auth import require_auth, require_admin

router = APIRouter(prefix="/api/coupons", tags=["coupons"])


@router.get("")
def list_coupons(session: dict = Depends(require_admin)):
    """Admin: List all coupons with redemption stats."""
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""
            SELECT
                c.*,
                t.name as tier_name,
                (SELECT COUNT(*) FROM coupon_redemptions WHERE coupon_id = c.id) as redemptions
            FROM coupons c
            LEFT JOIN tiers t ON t.id = c.tier_id
            ORDER BY c.created DESC
        """)
        return [r2d(r) for r in cur.fetchall()]


@router.post("")
def create_coupon(data: dict, session: dict = Depends(require_admin)):
    """
    Admin: Create new coupon.

    Required fields:
    - code: Unique coupon code
    - type: 'single_use', 'period', or 'wellpass'
    - tier_id: Target tier
    - duration_days: For period/wellpass coupons

    Optional fields:
    - max_redemptions: NULL = unlimited
    - valid_from, valid_until: Validity period
    - description: Internal note
    """
    code = data.get('code', '').strip().upper()
    coupon_type = data.get('type')
    tier_id = data.get('tier_id')
    duration_days = data.get('duration_days')
    max_redemptions = data.get('max_redemptions')
    valid_from = data.get('valid_from')
    valid_until = data.get('valid_until')
    description = data.get('description', '')

    if not code:
        raise HTTPException(400, "Coupon-Code fehlt")
    if coupon_type not in ['single_use', 'period', 'wellpass']:
        raise HTTPException(400, "Ungültiger Coupon-Typ")
    if not tier_id:
        raise HTTPException(400, "Tier fehlt")
    if coupon_type in ['period', 'wellpass'] and not duration_days:
        raise HTTPException(400, "duration_days fehlt für period/wellpass Coupons")

    with get_db() as conn:
        cur = get_cursor(conn)

        # Check if code already exists
        cur.execute("SELECT id FROM coupons WHERE code = %s", (code,))
        if cur.fetchone():
            raise HTTPException(400, f"Coupon-Code '{code}' existiert bereits")

        # Create coupon
        cur.execute("""
            INSERT INTO coupons (
                code, type, tier_id, duration_days, max_redemptions,
                valid_from, valid_until, description, created_by
            )
            VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)
            RETURNING id
        """, (
            code, coupon_type, tier_id, duration_days, max_redemptions,
            valid_from, valid_until, description, session['profile_id']
        ))

        coupon_id = cur.fetchone()['id']
        conn.commit()

        return {"ok": True, "id": coupon_id, "code": code}


@router.put("/{coupon_id}")
def update_coupon(coupon_id: str, data: dict, session: dict = Depends(require_admin)):
    """Admin: Update coupon."""
    with get_db() as conn:
        cur = get_cursor(conn)

        updates = []
        values = []

        if 'active' in data:
            updates.append('active = %s')
            values.append(data['active'])
        if 'max_redemptions' in data:
            updates.append('max_redemptions = %s')
            values.append(data['max_redemptions'])
        if 'valid_until' in data:
            updates.append('valid_until = %s')
            values.append(data['valid_until'])
        if 'description' in data:
            updates.append('description = %s')
            values.append(data['description'])

        if not updates:
            return {"ok": True}

        updates.append('updated = CURRENT_TIMESTAMP')
        values.append(coupon_id)

        cur.execute(
            f"UPDATE coupons SET {', '.join(updates)} WHERE id = %s",
            values
        )
        conn.commit()

        return {"ok": True}


@router.delete("/{coupon_id}")
def delete_coupon(coupon_id: str, session: dict = Depends(require_admin)):
    """Admin: Delete coupon (soft-delete: set active=false)."""
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("UPDATE coupons SET active = false WHERE id = %s", (coupon_id,))
        conn.commit()
        return {"ok": True}


@router.get("/{coupon_id}/redemptions")
def get_coupon_redemptions(coupon_id: str, session: dict = Depends(require_admin)):
    """Admin: Get all redemptions for a coupon."""
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""
            SELECT
                cr.id,
                cr.redeemed_at,
                p.name as profile_name,
                p.email as profile_email,
                ag.valid_from,
                ag.valid_until,
                ag.is_active
            FROM coupon_redemptions cr
            JOIN profiles p ON p.id = cr.profile_id
            LEFT JOIN access_grants ag ON ag.id = cr.access_grant_id
            WHERE cr.coupon_id = %s
            ORDER BY cr.redeemed_at DESC
        """, (coupon_id,))
        return [r2d(r) for r in cur.fetchall()]


@router.post("/redeem")
def redeem_coupon(data: dict, session: dict = Depends(require_auth)):
    """
    User: Redeem a coupon code.

    Creates an access_grant and handles Wellpass pause/resume logic.
    """
    code = data.get('code', '').strip().upper()
    if not code:
        raise HTTPException(400, "Coupon-Code fehlt")

    profile_id = session['profile_id']

    with get_db() as conn:
        cur = get_cursor(conn)

        # Get coupon
        cur.execute("""
            SELECT * FROM coupons
            WHERE code = %s AND active = true
        """, (code,))
        coupon = cur.fetchone()

        if not coupon:
            raise HTTPException(404, "Ungültiger Coupon-Code")

        # Check validity period
        now = datetime.now()
        if coupon['valid_from'] and now < coupon['valid_from']:
            raise HTTPException(400, "Coupon noch nicht gültig")
        if coupon['valid_until'] and now > coupon['valid_until']:
            raise HTTPException(400, "Coupon abgelaufen")

        # Check max redemptions
        if coupon['max_redemptions'] is not None:
            if coupon['redemption_count'] >= coupon['max_redemptions']:
                raise HTTPException(400, "Coupon bereits vollständig eingelöst")

        # Check if user already redeemed this coupon
        cur.execute("""
            SELECT id FROM coupon_redemptions
            WHERE coupon_id = %s AND profile_id = %s
        """, (coupon['id'], profile_id))
        if cur.fetchone():
            raise HTTPException(400, "Du hast diesen Coupon bereits eingelöst")

        # Create access grant
        valid_from = now
        valid_until = now + timedelta(days=coupon['duration_days']) if coupon['duration_days'] else None

        # Wellpass logic: Pause existing personal grants
        if coupon['type'] == 'wellpass':
            cur.execute("""
                SELECT id, valid_until
                FROM access_grants
                WHERE profile_id = %s
                  AND is_active = true
                  AND granted_by != 'wellpass'
                  AND valid_until > CURRENT_TIMESTAMP
            """, (profile_id,))
            active_grants = cur.fetchall()

            for grant in active_grants:
                # Calculate remaining days
                remaining = (grant['valid_until'] - now).days
                # Pause grant
                cur.execute("""
                    UPDATE access_grants
                    SET is_active = false,
                        paused_at = CURRENT_TIMESTAMP,
                        remaining_days = %s
                    WHERE id = %s
                """, (remaining, grant['id']))

        # Insert access grant
        cur.execute("""
            INSERT INTO access_grants (
                profile_id, tier_id, granted_by, coupon_id,
                valid_from, valid_until, is_active
            )
            VALUES (%s, %s, %s, %s, %s, %s, true)
            RETURNING id
        """, (
            profile_id, coupon['tier_id'],
            coupon['type'], coupon['id'],
            valid_from, valid_until
        ))
        grant_id = cur.fetchone()['id']

        # Record redemption
        cur.execute("""
            INSERT INTO coupon_redemptions (coupon_id, profile_id, access_grant_id)
            VALUES (%s, %s, %s)
        """, (coupon['id'], profile_id, grant_id))

        # Increment coupon redemption count
        cur.execute("""
            UPDATE coupons
            SET redemption_count = redemption_count + 1
            WHERE id = %s
        """, (coupon['id'],))

        # Log activity
        cur.execute("""
            INSERT INTO user_activity_log (profile_id, action, details)
            VALUES (%s, 'coupon_redeemed', %s)
        """, (
            profile_id,
            f'{{"coupon_code": "{code}", "tier": "{coupon["tier_id"]}", "duration_days": {coupon["duration_days"]}}}'
        ))

        conn.commit()

        return {
            "ok": True,
            "message": f"Coupon erfolgreich eingelöst: {coupon['tier_id']} für {coupon['duration_days']} Tage",
            "grant_id": grant_id,
            "valid_until": valid_until.isoformat() if valid_until else None
        }
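Reviewer note: `redeem_coupon` runs three independent rejection checks (validity window, redemption cap, duplicate redemption by the same user) before touching any tables. The pure decision logic can be isolated and unit-tested without a database; this sketch uses the same column names as the `coupons` table reads in the handler (the helper itself is hypothetical):

```python
from datetime import datetime

def coupon_rejection(coupon: dict, now: datetime):
    """Return the first rejection reason for a coupon row, or None if redeemable.

    Mirrors the order of checks in redeem_coupon; field names follow the
    coupons table as used there.
    """
    if coupon.get("valid_from") and now < coupon["valid_from"]:
        return "not_yet_valid"
    if coupon.get("valid_until") and now > coupon["valid_until"]:
        return "expired"
    if coupon.get("max_redemptions") is not None and \
            coupon["redemption_count"] >= coupon["max_redemptions"]:
        return "fully_redeemed"
    return None

now = datetime(2026, 1, 15)
early = {"valid_from": datetime(2026, 2, 1), "valid_until": None,
         "max_redemptions": None, "redemption_count": 0}
used_up = {"valid_from": None, "valid_until": None,
           "max_redemptions": 10, "redemption_count": 10}
ok = {"valid_from": None, "valid_until": datetime(2026, 6, 1),
      "max_redemptions": None, "redemption_count": 3}
```

Separately: the activity-log `details` value is built with a hand-rolled f-string; `json.dumps({...})` would be the robust way to produce that JSON if a coupon field ever contains quotes.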
backend/routers/evaluation.py (Normal file, 146 lines)
@@ -0,0 +1,146 @@
"""
Evaluation Endpoints - Training Type Profiles
Endpoints for activity evaluation and re-evaluation.

Issue: #15
Date: 2026-03-23
"""
import logging
from typing import Optional
from fastapi import APIRouter, HTTPException, Depends

from db import get_db, get_cursor, r2d
from auth import require_auth, require_admin
from evaluation_helper import (
    evaluate_and_save_activity,
    batch_evaluate_activities,
    load_parameters_registry
)

router = APIRouter(prefix="/api/evaluation", tags=["evaluation"])
logger = logging.getLogger(__name__)


@router.get("/parameters")
def list_parameters(session: dict = Depends(require_auth)):
    """
    List all available training parameters.
    """
    with get_db() as conn:
        cur = get_cursor(conn)
        parameters = load_parameters_registry(cur)

    return {
        "parameters": list(parameters.values()),
        "count": len(parameters)
    }


@router.post("/activity/{activity_id}")
def evaluate_activity(
    activity_id: str,
    session: dict = Depends(require_auth)
):
    """
    Evaluates or re-evaluates a single activity.

    Returns the evaluation result.
    """
    profile_id = session['profile_id']

    with get_db() as conn:
        cur = get_cursor(conn)

        # Load activity
        cur.execute("""
            SELECT id, profile_id, date, training_type_id, duration_min,
                   hr_avg, hr_max, distance_km, kcal_active, kcal_resting,
                   rpe, pace_min_per_km, cadence, elevation_gain
            FROM activity_log
            WHERE id = %s AND profile_id = %s
        """, (activity_id, profile_id))

        activity = cur.fetchone()
        if not activity:
            raise HTTPException(404, "Activity not found")

        activity_dict = dict(activity)

        # Evaluate
        result = evaluate_and_save_activity(
            cur,
            activity_dict["id"],
            activity_dict,
            activity_dict["training_type_id"],
            profile_id
        )

        if not result:
            return {
                "message": "No profile configured for this training type",
                "evaluation": None
            }

        return {
            "message": "Activity evaluated",
            "evaluation": result
        }


@router.post("/batch")
def batch_evaluate(
    limit: Optional[int] = None,
    session: dict = Depends(require_auth)
):
    """
    Re-evaluates all activities for the current user.

    Optional limit parameter for testing.
    """
    profile_id = session['profile_id']

    with get_db() as conn:
        cur = get_cursor(conn)
        stats = batch_evaluate_activities(cur, profile_id, limit)

    return {
        "message": "Batch evaluation completed",
        "stats": stats
    }


@router.post("/batch/all")
def batch_evaluate_all(session: dict = Depends(require_admin)):
    """
    Admin-only: Re-evaluates all activities for all users.

    Use with caution on large databases!
    """
    with get_db() as conn:
        cur = get_cursor(conn)

        # Get all profiles
        cur.execute("SELECT id FROM profiles")
        profiles = cur.fetchall()

        total_stats = {
            "profiles": len(profiles),
            "total": 0,
            "evaluated": 0,
            "skipped": 0,
            "errors": 0
        }

        for profile in profiles:
            profile_id = profile['id']
            stats = batch_evaluate_activities(cur, profile_id)

            total_stats["total"] += stats["total"]
            total_stats["evaluated"] += stats["evaluated"]
            total_stats["skipped"] += stats["skipped"]
            total_stats["errors"] += stats["errors"]

    return {
        "message": "Batch evaluation for all users completed",
        "stats": total_stats
    }
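Reviewer note: the admin-only `/batch/all` endpoint folds the per-profile stats dicts from `batch_evaluate_activities` into one total. That aggregation is trivial to factor out and test in isolation (the `merge_stats` name is illustrative; the key set matches the stats dicts the endpoint sums):

```python
def merge_stats(per_profile: list) -> dict:
    """Fold per-profile batch-evaluation stats into one total dict."""
    total = {"profiles": len(per_profile),
             "total": 0, "evaluated": 0, "skipped": 0, "errors": 0}
    for s in per_profile:
        for key in ("total", "evaluated", "skipped", "errors"):
            total[key] += s[key]
    return total

totals = merge_stats([
    {"total": 10, "evaluated": 8, "skipped": 2, "errors": 0},
    {"total": 5, "evaluated": 4, "skipped": 0, "errors": 1},
])
# totals == {"profiles": 2, "total": 15, "evaluated": 12, "skipped": 2, "errors": 1}
```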
346
backend/routers/exportdata.py
Normal file
346
backend/routers/exportdata.py
Normal file
|
|
@ -0,0 +1,346 @@
|
||||||
|
"""
|
||||||
|
Data Export Endpoints for Mitai Jinkendo
|
||||||
|
|
||||||
|
Handles CSV, JSON, and ZIP exports with photos.
|
||||||
|
"""
|
||||||
|
import os
|
||||||
|
import csv
|
||||||
|
import io
|
||||||
|
import json
|
||||||
|
import logging
|
||||||
|
import zipfile
|
||||||
|
from pathlib import Path
|
||||||
|
from typing import Optional
|
||||||
|
from datetime import datetime
|
||||||
|
from decimal import Decimal
|
||||||
|
|
||||||
|
from fastapi import APIRouter, HTTPException, Header, Depends
|
||||||
|
from fastapi.responses import StreamingResponse, Response
|
||||||
|
|
||||||
|
from db import get_db, get_cursor, r2d
|
||||||
|
from auth import require_auth, check_feature_access, increment_feature_usage
|
||||||
|
from routers.profiles import get_pid
|
||||||
|
from feature_logger import log_feature_usage
|
||||||
|
|
||||||
|
router = APIRouter(prefix="/api/export", tags=["export"])
|
||||||
|
logger = logging.getLogger(__name__)
|
||||||
|
|
||||||
|
PHOTOS_DIR = Path(os.getenv("PHOTOS_DIR", "./photos"))
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/csv")
|
||||||
|
def export_csv(x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
|
||||||
|
"""Export all data as CSV."""
|
||||||
|
pid = get_pid(x_profile_id)
|
||||||
|
|
||||||
|
# Phase 4: Check feature access and ENFORCE
|
||||||
|
access = check_feature_access(pid, 'data_export')
|
||||||
|
log_feature_usage(pid, 'data_export', access, 'export_csv')
|
||||||
|
|
||||||
|
if not access['allowed']:
|
||||||
|
logger.warning(
|
||||||
|
f"[FEATURE-LIMIT] User {pid} blocked: "
|
||||||
|
f"data_export {access['reason']} (used: {access['used']}, limit: {access['limit']})"
|
||||||
|
)
|
||||||
|
raise HTTPException(
|
||||||
|
status_code=403,
|
||||||
|
detail=f"Limit erreicht: Du hast das Kontingent für Daten-Exporte überschritten ({access['used']}/{access['limit']}). "
|
||||||
|
f"Bitte kontaktiere den Admin oder warte bis zum nächsten Reset."
|
||||||
|
)
|
||||||
|
|
||||||
|
# Build CSV
|
||||||
|
output = io.StringIO()
|
||||||
|
writer = csv.writer(output)
|
||||||
|
|
||||||
|
# Header
|
||||||
|
writer.writerow(["Typ", "Datum", "Wert", "Details"])
|
||||||
|
|
||||||
|
# Weight
|
||||||
|
with get_db() as conn:
|
||||||
|
cur = get_cursor(conn)
|
||||||
|
cur.execute("SELECT date, weight, note FROM weight_log WHERE profile_id=%s ORDER BY date", (pid,))
|
||||||
|
for r in cur.fetchall():
|
||||||
|
writer.writerow(["Gewicht", r['date'], f"{float(r['weight'])}kg", r['note'] or ""])
|
||||||
|
|
||||||
|
# Circumferences
|
||||||
|
cur.execute("SELECT date, c_waist, c_belly, c_hip FROM circumference_log WHERE profile_id=%s ORDER BY date", (pid,))
|
||||||
|
for r in cur.fetchall():
|
||||||
|
details = f"Taille:{float(r['c_waist'])}cm Bauch:{float(r['c_belly'])}cm Hüfte:{float(r['c_hip'])}cm"
|
||||||
|
writer.writerow(["Umfänge", r['date'], "", details])
|
||||||
|
|
||||||
|
# Caliper
|
||||||
|
cur.execute("SELECT date, body_fat_pct, lean_mass FROM caliper_log WHERE profile_id=%s ORDER BY date", (pid,))
|
||||||
|
for r in cur.fetchall():
|
||||||
|
writer.writerow(["Caliper", r['date'], f"{float(r['body_fat_pct'])}%", f"Magermasse:{float(r['lean_mass'])}kg"])
|
||||||
|
|
||||||
|
# Nutrition
|
||||||
|
cur.execute("SELECT date, kcal, protein_g FROM nutrition_log WHERE profile_id=%s ORDER BY date", (pid,))
|
||||||
|
for r in cur.fetchall():
|
||||||
|
writer.writerow(["Ernährung", r['date'], f"{float(r['kcal'])}kcal", f"Protein:{float(r['protein_g'])}g"])
|
||||||
|
|
||||||
|
# Activity
|
||||||
|
cur.execute("SELECT date, activity_type, duration_min, kcal_active FROM activity_log WHERE profile_id=%s ORDER BY date", (pid,))
|
||||||
|
for r in cur.fetchall():
|
||||||
|
writer.writerow(["Training", r['date'], r['activity_type'], f"{float(r['duration_min'])}min {float(r['kcal_active'])}kcal"])
|
||||||
|
|
||||||
|
output.seek(0)
|
||||||
|
|
||||||
|
# Phase 2: Increment usage counter
|
||||||
|
increment_feature_usage(pid, 'data_export')
|
||||||
|
|
||||||
|
return StreamingResponse(
|
||||||
|
iter([output.getvalue()]),
|
||||||
|
media_type="text/csv",
|
||||||
|
headers={"Content-Disposition": f"attachment; filename=mitai-export-{pid}.csv"}
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/json")
|
||||||
|
def export_json(x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
|
||||||
|
"""Export all data as JSON."""
|
||||||
|
pid = get_pid(x_profile_id)
|
||||||
|
|
||||||
|
# Phase 4: Check feature access and ENFORCE
|
||||||
|
access = check_feature_access(pid, 'data_export')
|
||||||
|
log_feature_usage(pid, 'data_export', access, 'export_json')
|
||||||
|
|
||||||
|
if not access['allowed']:
|
||||||
|
logger.warning(
|
||||||
|
f"[FEATURE-LIMIT] User {pid} blocked: "
|
||||||
|
f"data_export {access['reason']} (used: {access['used']}, limit: {access['limit']})"
|
||||||
|
)
|
||||||
|
raise HTTPException(
|
||||||
|
status_code=403,
|
||||||
|
detail=f"Limit erreicht: Du hast das Kontingent für Daten-Exporte überschritten ({access['used']}/{access['limit']}). "
|
||||||
|
f"Bitte kontaktiere den Admin oder warte bis zum nächsten Reset."
|
||||||
|
)
|
||||||
|
|
||||||
|
# Collect all data
|
||||||
|
data = {}
|
||||||
|
with get_db() as conn:
|
||||||
|
cur = get_cursor(conn)
|
||||||
|
|
||||||
|
cur.execute("SELECT * FROM profiles WHERE id=%s", (pid,))
|
||||||
|
data['profile'] = r2d(cur.fetchone())
|
||||||
|
|
||||||
|
cur.execute("SELECT * FROM weight_log WHERE profile_id=%s ORDER BY date", (pid,))
|
||||||
|
data['weight'] = [r2d(r) for r in cur.fetchall()]
|
||||||
|
|
||||||
|
cur.execute("SELECT * FROM circumference_log WHERE profile_id=%s ORDER BY date", (pid,))
|
||||||
|
data['circumferences'] = [r2d(r) for r in cur.fetchall()]
|
||||||
|
|
||||||
|
cur.execute("SELECT * FROM caliper_log WHERE profile_id=%s ORDER BY date", (pid,))
|
||||||
|
data['caliper'] = [r2d(r) for r in cur.fetchall()]
|
||||||
|
|
||||||
|
cur.execute("SELECT * FROM nutrition_log WHERE profile_id=%s ORDER BY date", (pid,))
|
||||||
|
data['nutrition'] = [r2d(r) for r in cur.fetchall()]
|
||||||
|
|
||||||
|
cur.execute("SELECT * FROM activity_log WHERE profile_id=%s ORDER BY date", (pid,))
|
||||||
|
data['activity'] = [r2d(r) for r in cur.fetchall()]
|
||||||
|
|
||||||
|
cur.execute("SELECT * FROM ai_insights WHERE profile_id=%s ORDER BY created DESC", (pid,))
|
||||||
|
data['insights'] = [r2d(r) for r in cur.fetchall()]
|
||||||
|
|
||||||
|
def decimal_handler(obj):
|
||||||
|
if isinstance(obj, Decimal):
|
||||||
|
return float(obj)
|
||||||
|
return str(obj)
|
||||||
|
|
||||||
|
json_str = json.dumps(data, indent=2, default=decimal_handler)
|
||||||
|
|
||||||
|
# Phase 2: Increment usage counter
|
||||||
|
increment_feature_usage(pid, 'data_export')
|
||||||
|
|
||||||
|
return Response(
|
||||||
|
content=json_str,
|
||||||
|
media_type="application/json",
|
||||||
|
headers={"Content-Disposition": f"attachment; filename=mitai-export-{pid}.json"}
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|


@router.get("/zip")
def export_zip(x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
    """Export all data as ZIP (CSV + JSON + photos) per specification."""
    pid = get_pid(x_profile_id)

    # Phase 4: Check feature access and ENFORCE
    access = check_feature_access(pid, 'data_export')
    log_feature_usage(pid, 'data_export', access, 'export_zip')

    if not access['allowed']:
        logger.warning(
            f"[FEATURE-LIMIT] User {pid} blocked: "
            f"data_export {access['reason']} (used: {access['used']}, limit: {access['limit']})"
        )
        raise HTTPException(
            status_code=403,
            detail=f"Limit erreicht: Du hast das Kontingent für Daten-Exporte überschritten ({access['used']}/{access['limit']}). "
                   f"Bitte kontaktiere den Admin oder warte bis zum nächsten Reset."
        )

    # Get profile
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("SELECT * FROM profiles WHERE id=%s", (pid,))
        prof = r2d(cur.fetchone())

    # Helper: CSV writer with UTF-8 BOM + semicolon
    def write_csv(zf, filename, rows, columns):
        if not rows:
            return
        output = io.StringIO()
        writer = csv.writer(output, delimiter=';')
        writer.writerow(columns)
        for r in rows:
            writer.writerow([
                '' if r.get(col) is None else
                (float(r[col]) if isinstance(r.get(col), Decimal) else r[col])
                for col in columns
            ])
        # UTF-8 with BOM for Excel
        csv_bytes = '\ufeff'.encode('utf-8') + output.getvalue().encode('utf-8')
        zf.writestr(f"data/{filename}", csv_bytes)
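The helper above prepends a UTF-8 BOM and uses semicolons so that Excel (notably with German regional settings) opens the CSV with correct umlauts and column splits. A minimal standalone sketch of that convention, with invented sample data:

```python
import csv
import io

def to_excel_csv(rows, columns):
    # Semicolon-delimited CSV with UTF-8 BOM, as written into the export ZIP;
    # None values become empty cells.
    output = io.StringIO()
    writer = csv.writer(output, delimiter=';')
    writer.writerow(columns)
    for r in rows:
        writer.writerow(['' if r.get(col) is None else r[col] for col in columns])
    return '\ufeff'.encode('utf-8') + output.getvalue().encode('utf-8')

csv_bytes_out = to_excel_csv(
    [{'date': '2024-05-01', 'weight': 82.4, 'note': None}],
    ['date', 'weight', 'note'],
)
```

The BOM (`\ufeff`) is redundant for most tools but is the most reliable hint for Excel to decode the file as UTF-8 instead of the legacy ANSI code page.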

    # Create ZIP
    zip_buffer = io.BytesIO()
    export_date = datetime.now().strftime('%Y-%m-%d')
    profile_name = prof.get('name', 'export')

    with zipfile.ZipFile(zip_buffer, 'w', zipfile.ZIP_DEFLATED) as zf:
        with get_db() as conn:
            cur = get_cursor(conn)

            # 1. README.txt
            readme = f"""Mitai Jinkendo – Datenexport
Version: 2
Exportiert am: {export_date}
Profil: {profile_name}

Inhalt:
- profile.json: Profildaten und Einstellungen
- data/*.csv: Messdaten (Semikolon-getrennt, UTF-8)
- insights/: KI-Auswertungen (JSON)
- photos/: Progress-Fotos (JPEG)

Import:
Dieser Export kann in Mitai Jinkendo unter
Einstellungen → Import → "Mitai Backup importieren"
wieder eingespielt werden.

Format-Version 2 (ab v9b):
Alle CSV-Dateien sind UTF-8 mit BOM kodiert.
Trennzeichen: Semikolon (;)
Datumsformat: YYYY-MM-DD
"""
            zf.writestr("README.txt", readme.encode('utf-8'))

            # 2. profile.json (without password hash)
            cur.execute("SELECT COUNT(*) as c FROM weight_log WHERE profile_id=%s", (pid,))
            w_count = cur.fetchone()['c']
            cur.execute("SELECT COUNT(*) as c FROM nutrition_log WHERE profile_id=%s", (pid,))
            n_count = cur.fetchone()['c']
            cur.execute("SELECT COUNT(*) as c FROM activity_log WHERE profile_id=%s", (pid,))
            a_count = cur.fetchone()['c']
            cur.execute("SELECT COUNT(*) as c FROM photos WHERE profile_id=%s", (pid,))
            p_count = cur.fetchone()['c']

            profile_data = {
                "export_version": "2",
                "export_date": export_date,
                "app": "Mitai Jinkendo",
                "profile": {
                    "name": prof.get('name'),
                    "email": prof.get('email'),
                    "sex": prof.get('sex'),
                    "height": float(prof['height']) if prof.get('height') else None,
                    "birth_year": prof['dob'].year if prof.get('dob') else None,
                    "goal_weight": float(prof['goal_weight']) if prof.get('goal_weight') else None,
                    "goal_bf_pct": float(prof['goal_bf_pct']) if prof.get('goal_bf_pct') else None,
                    "avatar_color": prof.get('avatar_color'),
                    "auth_type": prof.get('auth_type'),
                    "session_days": prof.get('session_days'),
                    "ai_enabled": prof.get('ai_enabled'),
                    "tier": prof.get('tier')
                },
                "stats": {
                    "weight_entries": w_count,
                    "nutrition_entries": n_count,
                    "activity_entries": a_count,
                    "photos": p_count
                }
            }
            zf.writestr("profile.json", json.dumps(profile_data, indent=2, ensure_ascii=False).encode('utf-8'))

            # 3-7. CSV exports (weight, circumferences, caliper, nutrition, activity)
            cur.execute("SELECT id, date, weight, note, source, created FROM weight_log WHERE profile_id=%s ORDER BY date", (pid,))
            write_csv(zf, "weight.csv", [r2d(r) for r in cur.fetchall()], ['id','date','weight','note','source','created'])

            cur.execute("SELECT id, date, c_waist, c_hip, c_chest, c_neck, c_arm, c_thigh, c_calf, notes, created FROM circumference_log WHERE profile_id=%s ORDER BY date", (pid,))
            rows = [r2d(r) for r in cur.fetchall()]
            for r in rows:
                r['waist'] = r.pop('c_waist', None); r['hip'] = r.pop('c_hip', None)
                r['chest'] = r.pop('c_chest', None); r['neck'] = r.pop('c_neck', None)
                r['upper_arm'] = r.pop('c_arm', None); r['thigh'] = r.pop('c_thigh', None)
                r['calf'] = r.pop('c_calf', None); r['forearm'] = None; r['note'] = r.pop('notes', None)
            write_csv(zf, "circumferences.csv", rows, ['id','date','waist','hip','chest','neck','upper_arm','thigh','calf','forearm','note','created'])

            cur.execute("SELECT id, date, sf_chest, sf_abdomen, sf_thigh, sf_triceps, sf_subscap, sf_suprailiac, sf_axilla, sf_method, body_fat_pct, notes, created FROM caliper_log WHERE profile_id=%s ORDER BY date", (pid,))
            rows = [r2d(r) for r in cur.fetchall()]
            for r in rows:
                r['chest'] = r.pop('sf_chest', None); r['abdomen'] = r.pop('sf_abdomen', None)
                r['thigh'] = r.pop('sf_thigh', None); r['tricep'] = r.pop('sf_triceps', None)
                r['subscapular'] = r.pop('sf_subscap', None); r['suprailiac'] = r.pop('sf_suprailiac', None)
                r['midaxillary'] = r.pop('sf_axilla', None); r['method'] = r.pop('sf_method', None)
                r['bf_percent'] = r.pop('body_fat_pct', None); r['note'] = r.pop('notes', None)
            write_csv(zf, "caliper.csv", rows, ['id','date','chest','abdomen','thigh','tricep','subscapular','suprailiac','midaxillary','method','bf_percent','note','created'])

            cur.execute("SELECT id, date, kcal, protein_g, fat_g, carbs_g, source, created FROM nutrition_log WHERE profile_id=%s ORDER BY date", (pid,))
            rows = [r2d(r) for r in cur.fetchall()]
            for r in rows:
                r['meal_name'] = ''; r['protein'] = r.pop('protein_g', None)
                r['fat'] = r.pop('fat_g', None); r['carbs'] = r.pop('carbs_g', None)
                r['fiber'] = None; r['note'] = ''
            write_csv(zf, "nutrition.csv", rows, ['id','date','meal_name','kcal','protein','fat','carbs','fiber','note','source','created'])

            cur.execute("SELECT id, date, activity_type, duration_min, kcal_active, hr_avg, hr_max, distance_km, notes, source, created FROM activity_log WHERE profile_id=%s ORDER BY date", (pid,))
            rows = [r2d(r) for r in cur.fetchall()]
            for r in rows:
                r['name'] = r['activity_type']; r['type'] = r.pop('activity_type', None)
                r['kcal'] = r.pop('kcal_active', None); r['heart_rate_avg'] = r.pop('hr_avg', None)
                r['heart_rate_max'] = r.pop('hr_max', None); r['note'] = r.pop('notes', None)
            write_csv(zf, "activity.csv", rows, ['id','date','name','type','duration_min','kcal','heart_rate_avg','heart_rate_max','distance_km','note','source','created'])

            # 8. insights/ai_insights.json
            cur.execute("SELECT id, scope, content, created FROM ai_insights WHERE profile_id=%s ORDER BY created DESC", (pid,))
            insights = []
            for r in cur.fetchall():
                rd = r2d(r)
                insights.append({
                    "id": rd['id'],
                    "scope": rd['scope'],
                    "created": rd['created'].isoformat() if hasattr(rd['created'], 'isoformat') else str(rd['created']),
                    "result": rd['content']
                })
            if insights:
                zf.writestr("insights/ai_insights.json", json.dumps(insights, indent=2, ensure_ascii=False).encode('utf-8'))

            # 9. photos/
            cur.execute("SELECT * FROM photos WHERE profile_id=%s ORDER BY date", (pid,))
            photos = [r2d(r) for r in cur.fetchall()]
            for i, photo in enumerate(photos):
                photo_path = Path(PHOTOS_DIR) / photo['path']
                if photo_path.exists():
                    filename = f"{photo.get('date') or export_date}_{i+1}{photo_path.suffix}"
                    zf.write(photo_path, f"photos/{filename}")

    zip_buffer.seek(0)
    filename = f"mitai-export-{profile_name.replace(' ','-')}-{export_date}.zip"

    # Phase 2: Increment usage counter
    increment_feature_usage(pid, 'data_export')

    return StreamingResponse(
        iter([zip_buffer.getvalue()]),
        media_type="application/zip",
        headers={"Content-Disposition": f"attachment; filename={filename}"}
    )


223  backend/routers/features.py  (new file)
@ -0,0 +1,223 @@

"""
Feature Management Endpoints for Mitai Jinkendo

Admin-only CRUD for features registry.
User endpoint for feature usage overview (Phase 3).
"""
from typing import Optional
from datetime import datetime

from fastapi import APIRouter, HTTPException, Header, Depends

from db import get_db, get_cursor, r2d
from auth import require_admin, require_auth, check_feature_access
from routers.profiles import get_pid

router = APIRouter(prefix="/api/features", tags=["features"])


@router.get("")
def list_features(session: dict = Depends(require_admin)):
    """Admin: List all features."""
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""
            SELECT * FROM features
            ORDER BY category, name
        """)
        return [r2d(r) for r in cur.fetchall()]


@router.post("")
def create_feature(data: dict, session: dict = Depends(require_admin)):
    """
    Admin: Create new feature.

    Required fields:
    - id: Feature ID (e.g., 'new_data_source')
    - name: Display name
    - category: 'data', 'ai', 'export', 'integration'
    - limit_type: 'count' or 'boolean'
    - reset_period: 'never', 'daily', 'monthly'
    - default_limit: INT or NULL (unlimited)
    """
    feature_id = data.get('id', '').strip()
    name = data.get('name', '').strip()
    description = data.get('description', '')
    category = data.get('category')
    limit_type = data.get('limit_type', 'count')
    reset_period = data.get('reset_period', 'never')
    default_limit = data.get('default_limit')

    if not feature_id or not name:
        raise HTTPException(400, "ID und Name fehlen")
    if category not in ['data', 'ai', 'export', 'integration']:
        raise HTTPException(400, "Ungültige Kategorie")
    if limit_type not in ['count', 'boolean']:
        raise HTTPException(400, "limit_type muss 'count' oder 'boolean' sein")
    if reset_period not in ['never', 'daily', 'monthly']:
        raise HTTPException(400, "Ungültiger reset_period")

    with get_db() as conn:
        cur = get_cursor(conn)

        # Check if ID already exists
        cur.execute("SELECT id FROM features WHERE id = %s", (feature_id,))
        if cur.fetchone():
            raise HTTPException(400, f"Feature '{feature_id}' existiert bereits")

        # Create feature
        cur.execute("""
            INSERT INTO features (
                id, name, description, category, limit_type, reset_period, default_limit
            )
            VALUES (%s, %s, %s, %s, %s, %s, %s)
        """, (feature_id, name, description, category, limit_type, reset_period, default_limit))

        conn.commit()

    return {"ok": True, "id": feature_id}


@router.put("/{feature_id}")
def update_feature(feature_id: str, data: dict, session: dict = Depends(require_admin)):
    """Admin: Update feature."""
    with get_db() as conn:
        cur = get_cursor(conn)

        updates = []
        values = []

        if 'name' in data:
            updates.append('name = %s')
            values.append(data['name'])
        if 'description' in data:
            updates.append('description = %s')
            values.append(data['description'])
        if 'default_limit' in data:
            updates.append('default_limit = %s')
            values.append(data['default_limit'])
        if 'active' in data:
            updates.append('active = %s')
            values.append(data['active'])

        if not updates:
            return {"ok": True}

        updates.append('updated = CURRENT_TIMESTAMP')
        values.append(feature_id)

        cur.execute(
            f"UPDATE features SET {', '.join(updates)} WHERE id = %s",
            values
        )
        conn.commit()

    return {"ok": True}
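The dynamic `SET`-clause assembly above keeps column names on a fixed whitelist while the values stay parameterized, so user input never reaches the SQL string itself. A standalone sketch of that pattern (the `build_update` helper is illustrative, not part of the codebase):

```python
def build_update(table, changes, key_value):
    # Assemble "UPDATE ... SET a = %s, b = %s WHERE id = %s" plus the value list.
    # Column names come from a hard-coded whitelist; only values are interpolated
    # by the database driver via %s placeholders.
    allowed = ['name', 'description', 'default_limit', 'active']
    updates, values = [], []
    for col in allowed:
        if col in changes:
            updates.append(f"{col} = %s")
            values.append(changes[col])
    values.append(key_value)
    sql = f"UPDATE {table} SET {', '.join(updates)} WHERE id = %s"
    return sql, values

sql, values = build_update('features', {'name': 'Export', 'active': True}, 'data_export')
```

Iterating over the whitelist (rather than over `changes`) also fixes the column order, which makes the generated SQL deterministic and easy to test.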


@router.delete("/{feature_id}")
def delete_feature(feature_id: str, session: dict = Depends(require_admin)):
    """Admin: Delete feature (soft-delete: set active=false)."""
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("UPDATE features SET active = false WHERE id = %s", (feature_id,))
        conn.commit()
    return {"ok": True}


@router.get("/{feature_id}/check-access")
def check_access(feature_id: str, session: dict = Depends(require_auth)):
    """
    User: Check if current user can access a feature.

    Returns:
    - allowed: bool - whether user can use the feature
    - limit: int|null - total limit (null = unlimited)
    - used: int - current usage
    - remaining: int|null - remaining uses (null = unlimited)
    - reason: str - why access is granted/denied
    """
    profile_id = session['profile_id']
    result = check_feature_access(profile_id, feature_id)
    return result


@router.get("/usage")
def get_feature_usage(x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
    """
    User: Get usage overview for all active features (Phase 3: Frontend Display).

    Returns list of all features with current usage, limits, and reset info.
    Automatically includes new features from database - no code changes needed.

    Response:
    [
        {
            "feature_id": "weight_entries",
            "name": "Gewichtseinträge",
            "description": "Anzahl der Gewichtseinträge",
            "category": "data",
            "limit_type": "count",
            "reset_period": "never",
            "used": 5,
            "limit": 10,
            "remaining": 5,
            "allowed": true,
            "reset_at": null
        },
        ...
    ]
    """
    pid = get_pid(x_profile_id)

    with get_db() as conn:
        cur = get_cursor(conn)

        # Get all active features (dynamic - picks up new features automatically)
        cur.execute("""
            SELECT id, name, description, category, limit_type, reset_period
            FROM features
            WHERE active = true
            ORDER BY category, name
        """)
        features = [r2d(r) for r in cur.fetchall()]

        result = []
        for feature in features:
            # Use existing check_feature_access to get usage and limits.
            # This respects user overrides, tier limits, and feature defaults.
            # Pass connection to avoid pool exhaustion.
            access = check_feature_access(pid, feature['id'], conn)

            # Get reset date from user_feature_usage
            cur.execute("""
                SELECT reset_at
                FROM user_feature_usage
                WHERE profile_id = %s AND feature_id = %s
            """, (pid, feature['id']))
            usage_row = cur.fetchone()

            # Format reset_at as ISO string
            reset_at = None
            if usage_row and usage_row['reset_at']:
                if isinstance(usage_row['reset_at'], datetime):
                    reset_at = usage_row['reset_at'].isoformat()
                else:
                    reset_at = str(usage_row['reset_at'])

            result.append({
                'feature_id': feature['id'],
                'name': feature['name'],
                'description': feature.get('description'),
                'category': feature.get('category'),
                'limit_type': feature['limit_type'],
                'reset_period': feature['reset_period'],
                'used': access['used'],
                'limit': access['limit'],
                'remaining': access['remaining'],
                'allowed': access['allowed'],
                'reset_at': reset_at
            })

    return result


378  backend/routers/focus_areas.py  (new file)
@ -0,0 +1,378 @@

"""
Focus Areas Router
Manages dynamic focus area definitions and user preferences
"""
from fastapi import APIRouter, HTTPException, Depends
from pydantic import BaseModel
from typing import Optional, List
from db import get_db, get_cursor, r2d
from auth import require_auth

router = APIRouter(prefix="/api/focus-areas", tags=["focus-areas"])

# ============================================================================
# Models
# ============================================================================

class FocusAreaCreate(BaseModel):
    """Create new focus area definition"""
    key: str
    name_de: str
    name_en: Optional[str] = None
    icon: Optional[str] = None
    description: Optional[str] = None
    category: str = 'custom'

class FocusAreaUpdate(BaseModel):
    """Update focus area definition"""
    name_de: Optional[str] = None
    name_en: Optional[str] = None
    icon: Optional[str] = None
    description: Optional[str] = None
    category: Optional[str] = None
    is_active: Optional[bool] = None

class UserFocusPreferences(BaseModel):
    """User's focus area weightings (dynamic)"""
    preferences: dict  # {focus_area_id: weight_pct}

# ============================================================================
# Focus Area Definitions (Admin)
# ============================================================================

@router.get("/definitions")
def list_focus_area_definitions(
    session: dict = Depends(require_auth),
    include_inactive: bool = False
):
    """
    List all available focus area definitions.

    Query params:
    - include_inactive: Include inactive focus areas (default: false)

    Returns focus areas grouped by category.
    """
    with get_db() as conn:
        cur = get_cursor(conn)

        query = """
            SELECT id, key, name_de, name_en, icon, description, category, is_active,
                   created_at, updated_at
            FROM focus_area_definitions
            WHERE is_active = true OR %s
            ORDER BY category, name_de
        """

        cur.execute(query, (include_inactive,))
        areas = [r2d(row) for row in cur.fetchall()]

        # Group by category
        grouped = {}
        for area in areas:
            cat = area['category'] or 'other'
            if cat not in grouped:
                grouped[cat] = []
            grouped[cat].append(area)

        return {
            "areas": areas,
            "grouped": grouped,
            "total": len(areas)
        }
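The grouping loop above can also be written with `dict.setdefault`, which avoids the explicit membership check; an equivalent standalone sketch with invented rows:

```python
def group_by_category(areas):
    # Same behavior as the endpoint's loop: rows whose category is None or ''
    # fall into the 'other' bucket.
    grouped = {}
    for area in areas:
        grouped.setdefault(area.get('category') or 'other', []).append(area)
    return grouped

grouped = group_by_category([
    {'key': 'strength', 'category': 'fitness'},
    {'key': 'sleep', 'category': 'recovery'},
    {'key': 'custom1', 'category': None},
])
```

`setdefault` returns the existing list when the key is present and inserts the empty list otherwise, so each row is appended exactly once.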


@router.post("/definitions")
def create_focus_area_definition(
    data: FocusAreaCreate,
    session: dict = Depends(require_auth)
):
    """
    Create new focus area definition (Admin only).

    Note: Requires admin role.
    """
    # Admin check
    if session.get('role') != 'admin':
        raise HTTPException(status_code=403, detail="Admin-Rechte erforderlich")

    with get_db() as conn:
        cur = get_cursor(conn)

        # Check if key already exists
        cur.execute(
            "SELECT id FROM focus_area_definitions WHERE key = %s",
            (data.key,)
        )
        if cur.fetchone():
            raise HTTPException(
                status_code=400,
                detail=f"Focus Area mit Key '{data.key}' existiert bereits"
            )

        # Insert
        cur.execute("""
            INSERT INTO focus_area_definitions
                (key, name_de, name_en, icon, description, category)
            VALUES (%s, %s, %s, %s, %s, %s)
            RETURNING id
        """, (
            data.key, data.name_de, data.name_en,
            data.icon, data.description, data.category
        ))

        area_id = cur.fetchone()['id']

        return {
            "id": area_id,
            "message": f"Focus Area '{data.name_de}' erstellt"
        }


@router.put("/definitions/{area_id}")
def update_focus_area_definition(
    area_id: str,
    data: FocusAreaUpdate,
    session: dict = Depends(require_auth)
):
    """Update focus area definition (Admin only)"""
    # Admin check
    if session.get('role') != 'admin':
        raise HTTPException(status_code=403, detail="Admin-Rechte erforderlich")

    with get_db() as conn:
        cur = get_cursor(conn)

        # Build dynamic UPDATE
        updates = []
        values = []

        if data.name_de is not None:
            updates.append("name_de = %s")
            values.append(data.name_de)
        if data.name_en is not None:
            updates.append("name_en = %s")
            values.append(data.name_en)
        if data.icon is not None:
            updates.append("icon = %s")
            values.append(data.icon)
        if data.description is not None:
            updates.append("description = %s")
            values.append(data.description)
        if data.category is not None:
            updates.append("category = %s")
            values.append(data.category)
        if data.is_active is not None:
            updates.append("is_active = %s")
            values.append(data.is_active)

        if not updates:
            raise HTTPException(status_code=400, detail="Keine Änderungen angegeben")

        updates.append("updated_at = NOW()")
        values.append(area_id)

        query = f"""
            UPDATE focus_area_definitions
            SET {', '.join(updates)}
            WHERE id = %s
            RETURNING id
        """

        cur.execute(query, values)

        if not cur.fetchone():
            raise HTTPException(status_code=404, detail="Focus Area nicht gefunden")

        return {"message": "Focus Area aktualisiert"}


@router.delete("/definitions/{area_id}")
def delete_focus_area_definition(
    area_id: str,
    session: dict = Depends(require_auth)
):
    """
    Delete focus area definition (Admin only).

    Cascades: Deletes all goal_focus_contributions referencing this area.
    """
    # Admin check
    if session.get('role') != 'admin':
        raise HTTPException(status_code=403, detail="Admin-Rechte erforderlich")

    with get_db() as conn:
        cur = get_cursor(conn)

        # Check if area is used
        cur.execute(
            "SELECT COUNT(*) as count FROM goal_focus_contributions WHERE focus_area_id = %s",
            (area_id,)
        )
        count = cur.fetchone()['count']

        if count > 0:
            raise HTTPException(
                status_code=400,
                detail=f"Focus Area wird von {count} Ziel(en) verwendet. "
                       "Bitte erst Zuordnungen entfernen oder auf 'inaktiv' setzen."
            )

        # Delete
        cur.execute(
            "DELETE FROM focus_area_definitions WHERE id = %s RETURNING id",
            (area_id,)
        )

        if not cur.fetchone():
            raise HTTPException(status_code=404, detail="Focus Area nicht gefunden")

        return {"message": "Focus Area gelöscht"}


# ============================================================================
# User Focus Preferences
# ============================================================================

@router.get("/user-preferences")
def get_user_focus_preferences(session: dict = Depends(require_auth)):
    """
    Get user's focus area weightings (dynamic system).

    Returns focus areas with user-set weights, grouped by category.
    """
    pid = session['profile_id']

    with get_db() as conn:
        cur = get_cursor(conn)

        # Get dynamic preferences (Migration 032)
        try:
            cur.execute("""
                SELECT
                    fa.id, fa.key, fa.name_de, fa.name_en, fa.icon,
                    fa.category, fa.description,
                    ufw.weight
                FROM user_focus_area_weights ufw
                JOIN focus_area_definitions fa ON ufw.focus_area_id = fa.id
                WHERE ufw.profile_id = %s AND ufw.weight > 0
                ORDER BY fa.category, fa.name_de
            """, (pid,))

            weights = [r2d(row) for row in cur.fetchall()]

            # Calculate percentages from weights
            total_weight = sum(w['weight'] for w in weights)
            if total_weight > 0:
                for w in weights:
                    w['percentage'] = round((w['weight'] / total_weight) * 100)
            else:
                for w in weights:
                    w['percentage'] = 0

            # Group by category
            grouped = {}
            for w in weights:
                cat = w['category'] or 'other'
                if cat not in grouped:
                    grouped[cat] = []
                grouped[cat].append(w)

            return {
                "weights": weights,
                "grouped": grouped,
                "total_weight": total_weight
            }

        except Exception as e:
            # Migration 032 not applied yet - return empty
            print(f"[WARNING] user_focus_area_weights not found: {e}")
            return {
                "weights": [],
                "grouped": {},
                "total_weight": 0
            }
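The percentage calculation above normalizes relative weights for display only; the stored values stay as raw weights. A standalone sketch of the same math (the `to_percentages` helper and sample weights are invented):

```python
def to_percentages(weights):
    # Relative weights -> rounded display percentages; an all-zero or empty
    # input yields zeros instead of dividing by zero.
    total = sum(w['weight'] for w in weights)
    for w in weights:
        w['percentage'] = round((w['weight'] / total) * 100) if total > 0 else 0
    return weights

result = to_percentages([
    {'key': 'strength', 'weight': 3},
    {'key': 'sleep', 'weight': 1},
])
```

Because each percentage is rounded independently, the displayed values may not sum to exactly 100 (e.g. weights 1/1/1 give 33/33/33); that is acceptable for a display-only normalization.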


@router.put("/user-preferences")
def update_user_focus_preferences(
    data: dict,
    session: dict = Depends(require_auth)
):
    """
    Update user's focus area weightings (dynamic system).

    Expects: { "weights": { "focus_area_id": weight, ... } }
    Weights are relative (0-100), normalized in display only.
    """
    pid = session['profile_id']

    if 'weights' not in data:
        raise HTTPException(status_code=400, detail="'weights' field required")

    weights = data['weights']  # Dict: focus_area_id → weight

    with get_db() as conn:
        cur = get_cursor(conn)

        # Delete existing weights
        cur.execute(
            "DELETE FROM user_focus_area_weights WHERE profile_id = %s",
            (pid,)
        )

        # Insert new weights (only non-zero)
        for focus_area_id, weight in weights.items():
            weight_int = int(weight)
            if weight_int > 0:
                cur.execute("""
                    INSERT INTO user_focus_area_weights
                        (profile_id, focus_area_id, weight)
                    VALUES (%s, %s, %s)
                    ON CONFLICT (profile_id, focus_area_id)
                    DO UPDATE SET
                        weight = EXCLUDED.weight,
                        updated_at = NOW()
                """, (pid, focus_area_id, weight_int))

    return {
        "message": "Focus Area Gewichtungen aktualisiert",
        "count": len([w for w in weights.values() if int(w) > 0])
    }


# ============================================================================
# Stats & Analytics
# ============================================================================

@router.get("/stats")
def get_focus_area_stats(session: dict = Depends(require_auth)):
    """
    Get focus area statistics for current user.

    Returns:
    - Progress per focus area (avg of all contributing goals)
    - Goal count per focus area
    - Top/bottom performing areas
    """
    pid = session['profile_id']

    with get_db() as conn:
        cur = get_cursor(conn)

        cur.execute("""
            SELECT
                fa.id, fa.key, fa.name_de, fa.icon, fa.category,
                COUNT(DISTINCT gfc.goal_id) as goal_count,
                AVG(g.progress_pct) as avg_progress,
                SUM(gfc.contribution_weight) as total_contribution
            FROM focus_area_definitions fa
            LEFT JOIN goal_focus_contributions gfc ON fa.id = gfc.focus_area_id
            LEFT JOIN goals g ON gfc.goal_id = g.id AND g.profile_id = %s
            WHERE fa.is_active = true
            GROUP BY fa.id
            HAVING COUNT(DISTINCT gfc.goal_id) > 0  -- Only areas with goals
            ORDER BY avg_progress DESC NULLS LAST
        """, (pid,))

        stats = [r2d(row) for row in cur.fetchall()]

        return {
            "stats": stats,
            "top_area": stats[0] if stats else None,
            "bottom_area": stats[-1] if len(stats) > 1 else None
        }


1339  backend/routers/goals.py  (new file)
File diff suppressed because it is too large


288  backend/routers/importdata.py  (new file)
@ -0,0 +1,288 @@

"""
Data Import Endpoints for Mitai Jinkendo

Handles ZIP import with validation and rollback support.
"""
import os
import csv
import io
import json
import uuid
import logging
import zipfile
from pathlib import Path
from typing import Optional
from datetime import datetime

from fastapi import APIRouter, HTTPException, UploadFile, File, Header, Depends

from db import get_db, get_cursor
from auth import require_auth, check_feature_access, increment_feature_usage
from routers.profiles import get_pid
from feature_logger import log_feature_usage

router = APIRouter(prefix="/api/import", tags=["import"])
logger = logging.getLogger(__name__)

PHOTOS_DIR = Path(os.getenv("PHOTOS_DIR", "./photos"))


@router.post("/zip")
async def import_zip(
    file: UploadFile = File(...),
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """
    Import data from ZIP export file.

    - Validates export format
    - Imports missing entries only (ON CONFLICT DO NOTHING)
    - Imports photos
    - Returns import summary
    - Full rollback on error
    """
    pid = get_pid(x_profile_id)

    # Phase 4: Check feature access and ENFORCE
    access = check_feature_access(pid, 'data_import')
    log_feature_usage(pid, 'data_import', access, 'import_zip')

    if not access['allowed']:
        logger.warning(
            f"[FEATURE-LIMIT] User {pid} blocked: "
            f"data_import {access['reason']} (used: {access['used']}, limit: {access['limit']})"
        )
        raise HTTPException(
            status_code=403,
            detail=f"Limit erreicht: Du hast das Kontingent für Daten-Importe überschritten ({access['used']}/{access['limit']}). "
                   f"Bitte kontaktiere den Admin oder warte bis zum nächsten Reset."
        )

    # Read uploaded file
    content = await file.read()
    zip_buffer = io.BytesIO(content)

    try:
        with zipfile.ZipFile(zip_buffer, 'r') as zf:
            # 1. Validate profile.json
            if 'profile.json' not in zf.namelist():
                raise HTTPException(400, "Ungültiger Export: profile.json fehlt")

            profile_data = json.loads(zf.read('profile.json').decode('utf-8'))
|
||||||
|
export_version = profile_data.get('export_version', '1')
|
||||||
|
|
||||||
|
# Stats tracker
|
||||||
|
stats = {
|
||||||
|
'weight': 0,
|
||||||
|
'circumferences': 0,
|
||||||
|
'caliper': 0,
|
||||||
|
'nutrition': 0,
|
||||||
|
'activity': 0,
|
||||||
|
'photos': 0,
|
||||||
|
'insights': 0
|
||||||
|
}
|
||||||
|
|
||||||
|
with get_db() as conn:
|
||||||
|
cur = get_cursor(conn)
|
||||||
|
|
||||||
|
try:
|
||||||
|
# 2. Import weight.csv
|
||||||
|
if 'data/weight.csv' in zf.namelist():
|
||||||
|
csv_data = zf.read('data/weight.csv').decode('utf-8-sig')
|
||||||
|
reader = csv.DictReader(io.StringIO(csv_data), delimiter=';')
|
||||||
|
for row in reader:
|
||||||
|
cur.execute("""
|
||||||
|
INSERT INTO weight_log (profile_id, date, weight, note, source, created)
|
||||||
|
VALUES (%s, %s, %s, %s, %s, %s)
|
||||||
|
ON CONFLICT (profile_id, date) DO NOTHING
|
||||||
|
""", (
|
||||||
|
pid,
|
||||||
|
row['date'],
|
||||||
|
float(row['weight']) if row['weight'] else None,
|
||||||
|
row.get('note', ''),
|
||||||
|
row.get('source', 'import'),
|
||||||
|
row.get('created', datetime.now())
|
||||||
|
))
|
||||||
|
if cur.rowcount > 0:
|
||||||
|
stats['weight'] += 1
|
||||||
|
|
||||||
|
# 3. Import circumferences.csv
|
||||||
|
if 'data/circumferences.csv' in zf.namelist():
|
||||||
|
csv_data = zf.read('data/circumferences.csv').decode('utf-8-sig')
|
||||||
|
reader = csv.DictReader(io.StringIO(csv_data), delimiter=';')
|
||||||
|
for row in reader:
|
||||||
|
cur.execute("""
|
||||||
|
INSERT INTO circumference_log (
|
||||||
|
profile_id, date, c_waist, c_hip, c_chest, c_neck,
|
||||||
|
c_arm, c_thigh, c_calf, notes, created
|
||||||
|
)
|
||||||
|
VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
|
||||||
|
ON CONFLICT (profile_id, date) DO NOTHING
|
||||||
|
""", (
|
||||||
|
pid,
|
||||||
|
row['date'],
|
||||||
|
float(row['waist']) if row.get('waist') else None,
|
||||||
|
float(row['hip']) if row.get('hip') else None,
|
||||||
|
float(row['chest']) if row.get('chest') else None,
|
||||||
|
float(row['neck']) if row.get('neck') else None,
|
||||||
|
float(row['upper_arm']) if row.get('upper_arm') else None,
|
||||||
|
float(row['thigh']) if row.get('thigh') else None,
|
||||||
|
float(row['calf']) if row.get('calf') else None,
|
||||||
|
row.get('note', ''),
|
||||||
|
row.get('created', datetime.now())
|
||||||
|
))
|
||||||
|
if cur.rowcount > 0:
|
||||||
|
stats['circumferences'] += 1
|
||||||
|
|
||||||
|
# 4. Import caliper.csv
|
||||||
|
if 'data/caliper.csv' in zf.namelist():
|
||||||
|
csv_data = zf.read('data/caliper.csv').decode('utf-8-sig')
|
||||||
|
reader = csv.DictReader(io.StringIO(csv_data), delimiter=';')
|
||||||
|
for row in reader:
|
||||||
|
cur.execute("""
|
||||||
|
INSERT INTO caliper_log (
|
||||||
|
profile_id, date, sf_chest, sf_abdomen, sf_thigh,
|
||||||
|
sf_triceps, sf_subscap, sf_suprailiac, sf_axilla,
|
||||||
|
sf_method, body_fat_pct, notes, created
|
||||||
|
)
|
||||||
|
VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
|
||||||
|
ON CONFLICT (profile_id, date) DO NOTHING
|
||||||
|
""", (
|
||||||
|
pid,
|
||||||
|
row['date'],
|
||||||
|
float(row['chest']) if row.get('chest') else None,
|
||||||
|
float(row['abdomen']) if row.get('abdomen') else None,
|
||||||
|
float(row['thigh']) if row.get('thigh') else None,
|
||||||
|
float(row['tricep']) if row.get('tricep') else None,
|
||||||
|
float(row['subscapular']) if row.get('subscapular') else None,
|
||||||
|
float(row['suprailiac']) if row.get('suprailiac') else None,
|
||||||
|
float(row['midaxillary']) if row.get('midaxillary') else None,
|
||||||
|
row.get('method', 'jackson3'),
|
||||||
|
float(row['bf_percent']) if row.get('bf_percent') else None,
|
||||||
|
row.get('note', ''),
|
||||||
|
row.get('created', datetime.now())
|
||||||
|
))
|
||||||
|
if cur.rowcount > 0:
|
||||||
|
stats['caliper'] += 1
|
||||||
|
|
||||||
|
# 5. Import nutrition.csv
|
||||||
|
if 'data/nutrition.csv' in zf.namelist():
|
||||||
|
csv_data = zf.read('data/nutrition.csv').decode('utf-8-sig')
|
||||||
|
reader = csv.DictReader(io.StringIO(csv_data), delimiter=';')
|
||||||
|
for row in reader:
|
||||||
|
cur.execute("""
|
||||||
|
INSERT INTO nutrition_log (
|
||||||
|
profile_id, date, kcal, protein_g, fat_g, carbs_g, source, created
|
||||||
|
)
|
||||||
|
VALUES (%s, %s, %s, %s, %s, %s, %s, %s)
|
||||||
|
ON CONFLICT (profile_id, date) DO NOTHING
|
||||||
|
""", (
|
||||||
|
pid,
|
||||||
|
row['date'],
|
||||||
|
float(row['kcal']) if row.get('kcal') else None,
|
||||||
|
float(row['protein']) if row.get('protein') else None,
|
||||||
|
float(row['fat']) if row.get('fat') else None,
|
||||||
|
float(row['carbs']) if row.get('carbs') else None,
|
||||||
|
row.get('source', 'import'),
|
||||||
|
row.get('created', datetime.now())
|
||||||
|
))
|
||||||
|
if cur.rowcount > 0:
|
||||||
|
stats['nutrition'] += 1
|
||||||
|
|
||||||
|
# 6. Import activity.csv
|
||||||
|
if 'data/activity.csv' in zf.namelist():
|
||||||
|
csv_data = zf.read('data/activity.csv').decode('utf-8-sig')
|
||||||
|
reader = csv.DictReader(io.StringIO(csv_data), delimiter=';')
|
||||||
|
for row in reader:
|
||||||
|
cur.execute("""
|
||||||
|
INSERT INTO activity_log (
|
||||||
|
profile_id, date, activity_type, duration_min,
|
||||||
|
kcal_active, hr_avg, hr_max, distance_km, notes, source, created
|
||||||
|
)
|
||||||
|
VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
|
||||||
|
""", (
|
||||||
|
pid,
|
||||||
|
row['date'],
|
||||||
|
row.get('type', 'Training'),
|
||||||
|
float(row['duration_min']) if row.get('duration_min') else None,
|
||||||
|
float(row['kcal']) if row.get('kcal') else None,
|
||||||
|
float(row['heart_rate_avg']) if row.get('heart_rate_avg') else None,
|
||||||
|
float(row['heart_rate_max']) if row.get('heart_rate_max') else None,
|
||||||
|
float(row['distance_km']) if row.get('distance_km') else None,
|
||||||
|
row.get('note', ''),
|
||||||
|
row.get('source', 'import'),
|
||||||
|
row.get('created', datetime.now())
|
||||||
|
))
|
||||||
|
if cur.rowcount > 0:
|
||||||
|
stats['activity'] += 1
|
||||||
|
|
||||||
|
# 7. Import ai_insights.json
|
||||||
|
if 'insights/ai_insights.json' in zf.namelist():
|
||||||
|
insights_data = json.loads(zf.read('insights/ai_insights.json').decode('utf-8'))
|
||||||
|
for insight in insights_data:
|
||||||
|
cur.execute("""
|
||||||
|
INSERT INTO ai_insights (profile_id, scope, content, created)
|
||||||
|
VALUES (%s, %s, %s, %s)
|
||||||
|
""", (
|
||||||
|
pid,
|
||||||
|
insight['scope'],
|
||||||
|
insight['result'],
|
||||||
|
insight.get('created', datetime.now())
|
||||||
|
))
|
||||||
|
stats['insights'] += 1
|
||||||
|
|
||||||
|
# 8. Import photos
|
||||||
|
photo_files = [f for f in zf.namelist() if f.startswith('photos/') and not f.endswith('/')]
|
||||||
|
for photo_file in photo_files:
|
||||||
|
# Extract date from filename (format: YYYY-MM-DD_N.jpg)
|
||||||
|
filename = Path(photo_file).name
|
||||||
|
parts = filename.split('_')
|
||||||
|
photo_date = parts[0] if len(parts) > 0 else datetime.now().strftime('%Y-%m-%d')
|
||||||
|
|
||||||
|
# Generate new ID and path
|
||||||
|
photo_id = str(uuid.uuid4())
|
||||||
|
ext = Path(filename).suffix
|
||||||
|
new_filename = f"{photo_id}{ext}"
|
||||||
|
target_path = PHOTOS_DIR / new_filename
|
||||||
|
|
||||||
|
# Check if photo already exists for this date
|
||||||
|
cur.execute("""
|
||||||
|
SELECT id FROM photos
|
||||||
|
WHERE profile_id = %s AND date = %s
|
||||||
|
""", (pid, photo_date))
|
||||||
|
|
||||||
|
if cur.fetchone() is None:
|
||||||
|
# Write photo file
|
||||||
|
with open(target_path, 'wb') as f:
|
||||||
|
f.write(zf.read(photo_file))
|
||||||
|
|
||||||
|
# Insert DB record
|
||||||
|
cur.execute("""
|
||||||
|
INSERT INTO photos (id, profile_id, date, path, created)
|
||||||
|
VALUES (%s, %s, %s, %s, %s)
|
||||||
|
""", (photo_id, pid, photo_date, new_filename, datetime.now()))
|
||||||
|
stats['photos'] += 1
|
||||||
|
|
||||||
|
# Commit transaction
|
||||||
|
conn.commit()
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
# Rollback on any error
|
||||||
|
conn.rollback()
|
||||||
|
raise HTTPException(500, f"Import fehlgeschlagen: {str(e)}")
|
||||||
|
|
||||||
|
# Phase 2: Increment usage counter
|
||||||
|
increment_feature_usage(pid, 'data_import')
|
||||||
|
|
||||||
|
return {
|
||||||
|
"ok": True,
|
||||||
|
"message": "Import erfolgreich",
|
||||||
|
"stats": stats,
|
||||||
|
"total": sum(stats.values())
|
||||||
|
}
|
||||||
|
|
||||||
|
except zipfile.BadZipFile:
|
||||||
|
raise HTTPException(400, "Ungültige ZIP-Datei")
|
||||||
|
except Exception as e:
|
||||||
|
raise HTTPException(500, f"Import-Fehler: {str(e)}")
|
||||||
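Every CSV section in `import_zip` follows the same parsing convention: `utf-8-sig` decoding (which tolerates a leading BOM), a `;` delimiter, and empty fields mapped to SQL `NULL` via `float(row['x']) if row.get('x') else None`. A minimal standalone sketch of that row handling (column names are illustrative):

```python
import csv
import io

# Semicolon-delimited CSV with a UTF-8 BOM, like the ZIP export produces.
raw = "\ufeffdate;weight;note\n2024-01-05;81.4;\n2024-01-06;;rest day\n".encode("utf-8")

# utf-8-sig strips the BOM so the first header is 'date', not '\ufeffdate'.
reader = csv.DictReader(io.StringIO(raw.decode("utf-8-sig")), delimiter=";")
rows = [
    (row["date"], float(row["weight"]) if row["weight"] else None, row.get("note", ""))
    for row in reader
]

print(rows)  # [('2024-01-05', 81.4, ''), ('2024-01-06', None, 'rest day')]
```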
671 backend/routers/insights.py (Normal file)
@@ -0,0 +1,671 @@
"""
AI Insights Endpoints for Mitai Jinkendo

Handles AI analysis execution, prompt management, and usage tracking.
"""
import os
import json
import uuid
import logging
import httpx
from typing import Optional
from datetime import datetime

from fastapi import APIRouter, HTTPException, Header, Depends

from db import get_db, get_cursor, r2d
from auth import require_auth, require_admin, check_feature_access, increment_feature_usage
from routers.profiles import get_pid
from feature_logger import log_feature_usage
from quality_filter import get_quality_filter_sql

router = APIRouter(prefix="/api", tags=["insights"])
logger = logging.getLogger(__name__)

OPENROUTER_KEY = os.getenv("OPENROUTER_API_KEY", "")
OPENROUTER_MODEL = os.getenv("OPENROUTER_MODEL", "anthropic/claude-sonnet-4")
ANTHROPIC_KEY = os.getenv("ANTHROPIC_API_KEY", "")


# ── Helper Functions ──────────────────────────────────────────────────────────
def check_ai_limit(pid: str):
    """Check if profile has reached daily AI limit. Returns (allowed, limit, used)."""
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("SELECT ai_enabled, ai_limit_day FROM profiles WHERE id=%s", (pid,))
        prof = cur.fetchone()
        if not prof or not prof['ai_enabled']:
            raise HTTPException(403, "KI ist für dieses Profil deaktiviert")
        limit = prof['ai_limit_day']
        if limit is None:
            return (True, None, 0)
        today = datetime.now().date().isoformat()
        cur.execute("SELECT call_count FROM ai_usage WHERE profile_id=%s AND date=%s", (pid, today))
        usage = cur.fetchone()
        used = usage['call_count'] if usage else 0
        if used >= limit:
            raise HTTPException(429, f"Tägliches KI-Limit erreicht ({limit} Calls)")
        return (True, limit, used)


def inc_ai_usage(pid: str):
    """Increment AI usage counter for today."""
    today = datetime.now().date().isoformat()
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("SELECT id, call_count FROM ai_usage WHERE profile_id=%s AND date=%s", (pid, today))
        row = cur.fetchone()
        if row:
            cur.execute("UPDATE ai_usage SET call_count=%s WHERE id=%s", (row['call_count']+1, row['id']))
        else:
            cur.execute("INSERT INTO ai_usage (id, profile_id, date, call_count) VALUES (%s,%s,%s,1)",
                        (str(uuid.uuid4()), pid, today))


def _get_profile_data(pid: str):
    """Fetch all relevant data for AI analysis."""
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("SELECT * FROM profiles WHERE id=%s", (pid,))
        prof = r2d(cur.fetchone())

        # Issue #31: Get global quality filter setting
        quality_filter = get_quality_filter_sql(prof)
        cur.execute("SELECT * FROM weight_log WHERE profile_id=%s ORDER BY date DESC LIMIT 90", (pid,))
        weight = [r2d(r) for r in cur.fetchall()]
        cur.execute("SELECT * FROM circumference_log WHERE profile_id=%s ORDER BY date DESC LIMIT 30", (pid,))
        circ = [r2d(r) for r in cur.fetchall()]
        cur.execute("SELECT * FROM caliper_log WHERE profile_id=%s ORDER BY date DESC LIMIT 30", (pid,))
        caliper = [r2d(r) for r in cur.fetchall()]
        cur.execute("SELECT * FROM nutrition_log WHERE profile_id=%s ORDER BY date DESC LIMIT 90", (pid,))
        nutrition = [r2d(r) for r in cur.fetchall()]

        # Issue #31: Global quality filter (from user profile setting)
        cur.execute(f"""
            SELECT * FROM activity_log
            WHERE profile_id=%s
            {quality_filter}
            ORDER BY date DESC LIMIT 90
        """, (pid,))
        activity = [r2d(r) for r in cur.fetchall()]
        # v9d Phase 2: Sleep, Rest Days, Vitals
        cur.execute("SELECT * FROM sleep_log WHERE profile_id=%s ORDER BY date DESC LIMIT 30", (pid,))
        sleep = [r2d(r) for r in cur.fetchall()]
        cur.execute("SELECT * FROM rest_days WHERE profile_id=%s ORDER BY date DESC LIMIT 30", (pid,))
        rest_days = [r2d(r) for r in cur.fetchall()]
        # v9d Phase 2d Refactored: separate baseline and BP tables
        cur.execute("SELECT * FROM vitals_baseline WHERE profile_id=%s ORDER BY date DESC LIMIT 30", (pid,))
        vitals_baseline = [r2d(r) for r in cur.fetchall()]
        cur.execute("SELECT * FROM blood_pressure_log WHERE profile_id=%s ORDER BY measured_at DESC LIMIT 90", (pid,))
        blood_pressure = [r2d(r) for r in cur.fetchall()]
        return {
            "profile": prof,
            "weight": weight,
            "circumference": circ,
            "caliper": caliper,
            "nutrition": nutrition,
            "activity": activity,
            "sleep": sleep,
            "rest_days": rest_days,
            "vitals_baseline": vitals_baseline,
            "blood_pressure": blood_pressure
        }


def _render_template(template: str, data: dict) -> str:
    """Simple template variable replacement."""
    result = template
    for k, v in data.items():
        result = result.replace(f"{{{{{k}}}}}", str(v) if v is not None else "")
    return result
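`_render_template` replaces `{{key}}` placeholders literally: the f-string `f"{{{{{k}}}}}"` evaluates to `{{` + key + `}}`, and `None` values render as an empty string. A standalone illustration of the same replacement logic:

```python
def render(template: str, data: dict) -> str:
    # Same logic as _render_template: "{{" + key + "}}" -> str(value),
    # with None rendered as "".
    result = template
    for k, v in data.items():
        result = result.replace(f"{{{{{k}}}}}", str(v) if v is not None else "")
    return result

out = render("Hallo {{name}}, Ziel: {{goal_weight}}kg ({{missing}})",
             {"name": "Alex", "goal_weight": 80.0, "missing": None})
print(out)  # Hallo Alex, Ziel: 80.0kg ()
```

Note that unknown placeholders are simply left in place, since replacement is driven by the data keys, not by scanning the template.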
def _prepare_template_vars(data: dict) -> dict:
    """Prepare template variables from profile data."""
    prof = data['profile']
    weight = data['weight']
    circ = data['circumference']
    caliper = data['caliper']
    nutrition = data['nutrition']
    activity = data['activity']
    sleep = data.get('sleep', [])
    rest_days = data.get('rest_days', [])
    vitals_baseline = data.get('vitals_baseline', [])
    blood_pressure = data.get('blood_pressure', [])

    vars = {
        "name": prof.get('name', 'Nutzer'),
        "geschlecht": "männlich" if prof.get('sex') == 'm' else "weiblich",
        "height": prof.get('height', 178),
        "goal_weight": float(prof.get('goal_weight')) if prof.get('goal_weight') else "nicht gesetzt",
        "goal_bf_pct": float(prof.get('goal_bf_pct')) if prof.get('goal_bf_pct') else "nicht gesetzt",
        "weight_aktuell": float(weight[0]['weight']) if weight else "keine Daten",
        "kf_aktuell": float(caliper[0]['body_fat_pct']) if caliper and caliper[0].get('body_fat_pct') else "unbekannt",
    }

    # Calculate age from dob
    if prof.get('dob'):
        try:
            from datetime import date
            dob = datetime.strptime(prof['dob'], '%Y-%m-%d').date()
            today = date.today()
            age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
            vars['age'] = age
        except:
            vars['age'] = "unbekannt"
    else:
        vars['age'] = "unbekannt"

    # Weight trend summary
    if len(weight) >= 2:
        recent = weight[:30]
        delta = float(recent[0]['weight']) - float(recent[-1]['weight'])
        vars['weight_trend'] = f"{len(recent)} Einträge, Δ30d: {delta:+.1f}kg"
    else:
        vars['weight_trend'] = "zu wenig Daten"

    # Caliper summary
    if caliper:
        c = caliper[0]
        bf = float(c.get('body_fat_pct')) if c.get('body_fat_pct') else '?'
        vars['caliper_summary'] = f"KF: {bf}%, Methode: {c.get('sf_method','?')}"
    else:
        vars['caliper_summary'] = "keine Daten"

    # Circumference summary
    if circ:
        c = circ[0]
        parts = []
        for k in ['c_waist', 'c_belly', 'c_hip']:
            if c.get(k): parts.append(f"{k.split('_')[1]}: {float(c[k])}cm")
        vars['circ_summary'] = ", ".join(parts) if parts else "keine Daten"
    else:
        vars['circ_summary'] = "keine Daten"

    # Nutrition summary
    if nutrition:
        n = len(nutrition)
        avg_kcal = sum(float(d.get('kcal',0) or 0) for d in nutrition) / n
        avg_prot = sum(float(d.get('protein_g',0) or 0) for d in nutrition) / n
        vars['nutrition_summary'] = f"{n} Tage, Ø {avg_kcal:.0f}kcal, {avg_prot:.0f}g Protein"
        vars['nutrition_detail'] = vars['nutrition_summary']
        vars['nutrition_days'] = n
        vars['kcal_avg'] = round(avg_kcal)
        vars['protein_avg'] = round(avg_prot,1)
        vars['fat_avg'] = round(sum(float(d.get('fat_g',0) or 0) for d in nutrition) / n,1)
        vars['carb_avg'] = round(sum(float(d.get('carbs_g',0) or 0) for d in nutrition) / n,1)
    else:
        vars['nutrition_summary'] = "keine Daten"
        vars['nutrition_detail'] = "keine Daten"
        vars['nutrition_days'] = 0
        vars['kcal_avg'] = 0
        vars['protein_avg'] = 0
        vars['fat_avg'] = 0
        vars['carb_avg'] = 0

    # Protein targets
    w = weight[0]['weight'] if weight else prof.get('height',178) - 100
    w = float(w)  # Convert Decimal to float for math operations
    vars['protein_ziel_low'] = round(w * 1.6)
    vars['protein_ziel_high'] = round(w * 2.2)

    # Activity summary
    if activity:
        n = len(activity)
        total_kcal = sum(float(a.get('kcal_active',0) or 0) for a in activity)
        vars['activity_summary'] = f"{n} Trainings, {total_kcal:.0f}kcal gesamt"
        vars['activity_detail'] = vars['activity_summary']
        vars['activity_kcal_summary'] = f"Ø {total_kcal/n:.0f}kcal/Training"
    else:
        vars['activity_summary'] = "keine Daten"
        vars['activity_detail'] = "keine Daten"
        vars['activity_kcal_summary'] = "keine Daten"

    # Sleep summary (v9d Phase 2b)
    if sleep:
        n = len(sleep)
        avg_duration = sum(float(s.get('duration_minutes',0) or 0) for s in sleep) / n
        avg_quality = sum(int(s.get('quality',0) or 0) for s in sleep if s.get('quality')) / max(sum(1 for s in sleep if s.get('quality')), 1)
        deep_data = [s for s in sleep if s.get('deep_minutes')]
        avg_deep = sum(float(s.get('deep_minutes',0)) for s in deep_data) / len(deep_data) if deep_data else 0
        vars['sleep_summary'] = f"{n} Nächte, Ø {avg_duration/60:.1f}h Schlafdauer, Qualität {avg_quality:.1f}/5"
        vars['sleep_detail'] = f"Ø {avg_duration:.0f}min gesamt, {avg_deep:.0f}min Tiefschlaf"
        vars['sleep_avg_duration'] = round(avg_duration)
        vars['sleep_avg_quality'] = round(avg_quality, 1)
        vars['sleep_nights'] = n
    else:
        vars['sleep_summary'] = "keine Daten"
        vars['sleep_detail'] = "keine Daten"
        vars['sleep_avg_duration'] = 0
        vars['sleep_avg_quality'] = 0
        vars['sleep_nights'] = 0

    # Rest Days summary (v9d Phase 2a)
    if rest_days:
        n = len(rest_days)
        types = {}
        for rd in rest_days:
            rt = rd.get('rest_type', 'unknown')
            types[rt] = types.get(rt, 0) + 1
        type_summary = ", ".join([f"{k}: {v}x" for k, v in types.items()])
        vars['rest_days_summary'] = f"{n} Ruhetage (letzte 30d): {type_summary}"
        vars['rest_days_count'] = n
        vars['rest_days_types'] = type_summary
    else:
        vars['rest_days_summary'] = "keine Daten"
        vars['rest_days_count'] = 0
        vars['rest_days_types'] = "keine"

    # Vitals Baseline summary (v9d Phase 2d Refactored)
    if vitals_baseline:
        n = len(vitals_baseline)
        hr_data = [v for v in vitals_baseline if v.get('resting_hr')]
        hrv_data = [v for v in vitals_baseline if v.get('hrv')]
        vo2_data = [v for v in vitals_baseline if v.get('vo2_max')]

        avg_hr = sum(int(v.get('resting_hr')) for v in hr_data) / len(hr_data) if hr_data else 0
        avg_hrv = sum(int(v.get('hrv')) for v in hrv_data) / len(hrv_data) if hrv_data else 0
        latest_vo2 = float(vo2_data[0].get('vo2_max')) if vo2_data else 0

        parts = []
        if avg_hr: parts.append(f"Ruhepuls Ø {avg_hr:.0f}bpm")
        if avg_hrv: parts.append(f"HRV Ø {avg_hrv:.0f}ms")
        if latest_vo2: parts.append(f"VO2 Max {latest_vo2:.1f}")

        vars['vitals_summary'] = f"{n} Messungen: " + ", ".join(parts) if parts else "keine verwertbaren Daten"
        vars['vitals_detail'] = vars['vitals_summary']
        vars['vitals_avg_hr'] = round(avg_hr)
        vars['vitals_avg_hrv'] = round(avg_hrv)
        vars['vitals_vo2_max'] = round(latest_vo2, 1) if latest_vo2 else "k.A."
    else:
        vars['vitals_summary'] = "keine Daten"
        vars['vitals_detail'] = "keine Daten"
        vars['vitals_avg_hr'] = 0
        vars['vitals_avg_hrv'] = 0
        vars['vitals_vo2_max'] = "k.A."

    # Blood Pressure summary (v9d Phase 2d Refactored)
    if blood_pressure:
        n = len(blood_pressure)
        bp_data = [bp for bp in blood_pressure if bp.get('systolic') and bp.get('diastolic')]

        avg_bp_sys = sum(int(bp.get('systolic')) for bp in bp_data) / len(bp_data) if bp_data else 0
        avg_bp_dia = sum(int(bp.get('diastolic')) for bp in bp_data) / len(bp_data) if bp_data else 0

        vars['vitals_avg_bp'] = f"{round(avg_bp_sys)}/{round(avg_bp_dia)}" if avg_bp_sys else "k.A."
        vars['bp_summary'] = f"{n} Messungen, Ø {avg_bp_sys:.0f}/{avg_bp_dia:.0f} mmHg" if avg_bp_sys else "keine Daten"
    else:
        vars['vitals_avg_bp'] = "k.A."
        vars['bp_summary'] = "keine Daten"

    return vars
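Two of the derived variables in `_prepare_template_vars` are easy to check in isolation: the age calculation uses a tuple-comparison trick (subtracting one year while the birthday has not yet occurred this year), and the protein targets apply 1.6–2.2 g per kg of body weight. A standalone sketch with fixed dates:

```python
from datetime import date

def age_on(dob: date, today: date) -> int:
    # (month, day) tuple comparison is True (== 1) before the birthday,
    # so one year is subtracted until the birthday has passed.
    return today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))

dob = date(1990, 6, 15)
print(age_on(dob, date(2024, 6, 14)))  # 33 (day before the birthday)
print(age_on(dob, date(2024, 6, 15)))  # 34 (on the birthday)

# Protein targets as in the router: 1.6-2.2 g/kg body weight.
w = 81.4
protein_ziel_low, protein_ziel_high = round(w * 1.6), round(w * 2.2)
print(protein_ziel_low, protein_ziel_high)  # 130 179
```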
# ── Endpoints ─────────────────────────────────────────────────────────────────
|
||||||
|
@router.get("/insights")
|
||||||
|
def get_all_insights(x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
|
||||||
|
"""Get all AI insights for profile."""
|
||||||
|
pid = get_pid(x_profile_id)
|
||||||
|
with get_db() as conn:
|
||||||
|
cur = get_cursor(conn)
|
||||||
|
cur.execute("SELECT * FROM ai_insights WHERE profile_id=%s ORDER BY created DESC", (pid,))
|
||||||
|
rows = cur.fetchall()
|
||||||
|
return [r2d(r) for r in rows]
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/insights/latest")
|
||||||
|
def get_latest_insights(x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
|
||||||
|
"""Get latest AI insights across all scopes."""
|
||||||
|
pid = get_pid(x_profile_id)
|
||||||
|
with get_db() as conn:
|
||||||
|
cur = get_cursor(conn)
|
||||||
|
cur.execute("SELECT * FROM ai_insights WHERE profile_id=%s ORDER BY created DESC LIMIT 10", (pid,))
|
||||||
|
rows = cur.fetchall()
|
||||||
|
return [r2d(r) for r in rows]
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/ai/insights/{scope}")
|
||||||
|
def get_ai_insight(scope: str, x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
|
||||||
|
"""Get latest insight for specific scope."""
|
||||||
|
pid = get_pid(x_profile_id)
|
||||||
|
with get_db() as conn:
|
||||||
|
cur = get_cursor(conn)
|
||||||
|
cur.execute("SELECT * FROM ai_insights WHERE profile_id=%s AND scope=%s ORDER BY created DESC LIMIT 1", (pid,scope))
|
||||||
|
row = cur.fetchone()
|
||||||
|
if not row: return None
|
||||||
|
return r2d(row)
|
||||||
|
|
||||||
|
|
||||||
|
@router.delete("/insights/{insight_id}")
|
||||||
|
def delete_insight_by_id(insight_id: str, x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
|
||||||
|
"""Delete a specific insight by ID."""
|
||||||
|
pid = get_pid(x_profile_id)
|
||||||
|
with get_db() as conn:
|
||||||
|
cur = get_cursor(conn)
|
||||||
|
cur.execute("DELETE FROM ai_insights WHERE id=%s AND profile_id=%s", (insight_id, pid))
|
||||||
|
return {"ok":True}
|
||||||
|
|
||||||
|
|
||||||
|
@router.delete("/ai/insights/{scope}")
|
||||||
|
def delete_ai_insight(scope: str, x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
|
||||||
|
"""Delete all insights for specific scope."""
|
||||||
|
pid = get_pid(x_profile_id)
|
||||||
|
with get_db() as conn:
|
||||||
|
cur = get_cursor(conn)
|
||||||
|
cur.execute("DELETE FROM ai_insights WHERE profile_id=%s AND scope=%s", (pid,scope))
|
||||||
|
return {"ok":True}
|
||||||
|
|
||||||
|
|
||||||
|
@router.post("/insights/run/{slug}")
|
||||||
|
async def analyze_with_prompt(slug: str, x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
|
||||||
|
"""Run AI analysis with specified prompt template."""
|
||||||
|
pid = get_pid(x_profile_id)
|
||||||
|
|
||||||
|
# Phase 4: Check feature access and ENFORCE
|
||||||
|
access = check_feature_access(pid, 'ai_calls')
|
||||||
|
log_feature_usage(pid, 'ai_calls', access, 'analyze')
|
||||||
|
|
||||||
|
if not access['allowed']:
|
||||||
|
logger.warning(
|
||||||
|
f"[FEATURE-LIMIT] User {pid} blocked: "
|
||||||
|
f"ai_calls {access['reason']} (used: {access['used']}, limit: {access['limit']})"
|
||||||
|
)
|
||||||
|
raise HTTPException(
|
||||||
|
status_code=403,
|
||||||
|
detail=f"Limit erreicht: Du hast das Kontingent für KI-Analysen überschritten ({access['used']}/{access['limit']}). "
|
||||||
|
f"Bitte kontaktiere den Admin oder warte bis zum nächsten Reset."
|
||||||
|
)
|
||||||
|
|
||||||
|
# Get prompt template
|
||||||
|
with get_db() as conn:
|
||||||
|
cur = get_cursor(conn)
|
||||||
|
cur.execute("SELECT * FROM ai_prompts WHERE slug=%s AND active=true", (slug,))
|
||||||
|
prompt_row = cur.fetchone()
|
||||||
|
if not prompt_row:
|
||||||
|
raise HTTPException(404, f"Prompt '{slug}' nicht gefunden")
|
||||||
|
|
||||||
|
prompt_tmpl = prompt_row['template']
|
||||||
|
data = _get_profile_data(pid)
|
||||||
|
vars = _prepare_template_vars(data)
|
||||||
|
final_prompt = _render_template(prompt_tmpl, vars)
|
||||||
|
|
||||||
|
# Call AI
|
||||||
|
if ANTHROPIC_KEY:
|
||||||
|
# Use Anthropic SDK
|
||||||
|
import anthropic
|
||||||
|
client = anthropic.Anthropic(api_key=ANTHROPIC_KEY)
|
||||||
|
response = client.messages.create(
|
||||||
|
model="claude-sonnet-4-20250514",
|
||||||
|
max_tokens=2000,
|
||||||
|
messages=[{"role": "user", "content": final_prompt}]
|
||||||
|
)
|
||||||
|
content = response.content[0].text
|
||||||
|
elif OPENROUTER_KEY:
|
||||||
|
async with httpx.AsyncClient() as client:
|
||||||
|
resp = await client.post("https://openrouter.ai/api/v1/chat/completions",
|
||||||
|
headers={"Authorization": f"Bearer {OPENROUTER_KEY}"},
|
||||||
|
json={
|
||||||
|
"model": OPENROUTER_MODEL,
|
||||||
|
"messages": [{"role": "user", "content": final_prompt}],
|
||||||
|
"max_tokens": 2000
|
||||||
|
},
|
||||||
|
timeout=60.0
|
||||||
|
)
|
||||||
|
if resp.status_code != 200:
|
||||||
|
raise HTTPException(500, f"KI-Fehler: {resp.text}")
|
||||||
|
content = resp.json()['choices'][0]['message']['content']
|
||||||
|
else:
|
||||||
|
raise HTTPException(500, "Keine KI-API konfiguriert")
|
||||||
|
|
||||||
|
# Save insight (with history - no DELETE)
|
||||||
|
with get_db() as conn:
|
||||||
|
cur = get_cursor(conn)
|
||||||
|
cur.execute("INSERT INTO ai_insights (id, profile_id, scope, content, created) VALUES (%s,%s,%s,%s,CURRENT_TIMESTAMP)",
|
||||||
|
(str(uuid.uuid4()), pid, slug, content))
|
||||||
|
|
||||||
|
# Phase 2: Increment new feature usage counter
|
||||||
|
increment_feature_usage(pid, 'ai_calls')
|
||||||
|
|
||||||
|
# Old usage tracking (keep for now)
|
||||||
|
inc_ai_usage(pid)
|
||||||
|
|
||||||
|
return {"scope": slug, "content": content}
|
||||||
|
|
||||||
|
|
||||||
|
@router.post("/insights/pipeline")
async def analyze_pipeline(
    config_id: Optional[str] = None,
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """
    Run configurable multi-stage pipeline analysis.

    Args:
        config_id: Pipeline config ID (optional, uses default if not specified)
    """
    pid = get_pid(x_profile_id)

    # Phase 4: Check pipeline feature access (boolean - enabled/disabled)
    access_pipeline = check_feature_access(pid, 'ai_pipeline')
    log_feature_usage(pid, 'ai_pipeline', access_pipeline, 'pipeline')

    if not access_pipeline['allowed']:
        logger.warning(
            f"[FEATURE-LIMIT] User {pid} blocked: "
            f"ai_pipeline {access_pipeline['reason']}"
        )
        raise HTTPException(
            status_code=403,
            detail="Pipeline-Analyse ist nicht verfügbar. Bitte kontaktiere den Admin."
        )

    # Also check ai_calls (pipeline uses API calls too)
    access_calls = check_feature_access(pid, 'ai_calls')
    log_feature_usage(pid, 'ai_calls', access_calls, 'pipeline_calls')

    if not access_calls['allowed']:
        logger.warning(
            f"[FEATURE-LIMIT] User {pid} blocked: "
            f"ai_calls {access_calls['reason']} (used: {access_calls['used']}, limit: {access_calls['limit']})"
        )
        raise HTTPException(
            status_code=403,
            detail=f"Limit erreicht: Du hast das Kontingent für KI-Analysen überschritten ({access_calls['used']}/{access_calls['limit']}). "
                   f"Bitte kontaktiere den Admin oder warte bis zum nächsten Reset."
        )

    # Load pipeline config
    with get_db() as conn:
        cur = get_cursor(conn)
        if config_id:
            cur.execute("SELECT * FROM pipeline_configs WHERE id=%s AND active=true", (config_id,))
        else:
            cur.execute("SELECT * FROM pipeline_configs WHERE is_default=true AND active=true")

        config = r2d(cur.fetchone())
        if not config:
            raise HTTPException(404, "Pipeline-Konfiguration nicht gefunden")

    logger.info(f"[PIPELINE] Using config '{config['name']}' (id={config['id']})")

    data = _get_profile_data(pid)
    vars = _prepare_template_vars(data)

    # Stage 1: Load and execute prompts from config
    stage1_prompts = []
    with get_db() as conn:
        cur = get_cursor(conn)
        for slug in config['stage1_prompts']:
            cur.execute("SELECT slug, template FROM ai_prompts WHERE slug=%s AND active=true", (slug,))
            prompt = r2d(cur.fetchone())
            if prompt:
                stage1_prompts.append(prompt)
            else:
                logger.warning(f"[PIPELINE] Stage 1 prompt '{slug}' not found or inactive")

    stage1_results = {}
    for p in stage1_prompts:
        slug = p['slug']
        final_prompt = _render_template(p['template'], vars)

        if ANTHROPIC_KEY:
            import anthropic
            client = anthropic.Anthropic(api_key=ANTHROPIC_KEY)
            response = client.messages.create(
                model="claude-sonnet-4-20250514",
                max_tokens=1000,
                messages=[{"role": "user", "content": final_prompt}]
            )
            content = response.content[0].text.strip()
        elif OPENROUTER_KEY:
            async with httpx.AsyncClient() as client:
                resp = await client.post("https://openrouter.ai/api/v1/chat/completions",
                    headers={"Authorization": f"Bearer {OPENROUTER_KEY}"},
                    json={
                        "model": OPENROUTER_MODEL,
                        "messages": [{"role": "user", "content": final_prompt}],
                        "max_tokens": 1000
                    },
                    timeout=60.0
                )
                content = resp.json()['choices'][0]['message']['content'].strip()
        else:
            raise HTTPException(500, "Keine KI-API konfiguriert")

        # Try to parse JSON, fallback to raw text
        try:
            stage1_results[slug] = json.loads(content)
        except:
            stage1_results[slug] = content

    # Stage 2: Synthesis with dynamic placeholders
    # Inject all stage1 results as {{stage1_<slug>}} placeholders
    for slug, result in stage1_results.items():
        # Convert slug like "pipeline_body" to placeholder name "stage1_body"
        placeholder_name = slug.replace('pipeline_', 'stage1_')
        vars[placeholder_name] = json.dumps(result, ensure_ascii=False) if isinstance(result, dict) else str(result)

    # Load stage 2 prompt from config
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("SELECT template FROM ai_prompts WHERE slug=%s AND active=true", (config['stage2_prompt'],))
        synth_row = cur.fetchone()
        if not synth_row:
            raise HTTPException(500, f"Pipeline synthesis prompt '{config['stage2_prompt']}' not found")

    synth_prompt = _render_template(synth_row['template'], vars)

    if ANTHROPIC_KEY:
        import anthropic
        client = anthropic.Anthropic(api_key=ANTHROPIC_KEY)
        response = client.messages.create(
            model="claude-sonnet-4-20250514",
            max_tokens=2000,
            messages=[{"role": "user", "content": synth_prompt}]
        )
        synthesis = response.content[0].text
    elif OPENROUTER_KEY:
        async with httpx.AsyncClient() as client:
            resp = await client.post("https://openrouter.ai/api/v1/chat/completions",
                headers={"Authorization": f"Bearer {OPENROUTER_KEY}"},
                json={
                    "model": OPENROUTER_MODEL,
                    "messages": [{"role": "user", "content": synth_prompt}],
                    "max_tokens": 2000
                },
                timeout=60.0
            )
            synthesis = resp.json()['choices'][0]['message']['content']
    else:
        raise HTTPException(500, "Keine KI-API konfiguriert")

    # Stage 3: Optional (e.g., Goals)
    goals_text = None
    if config.get('stage3_prompt'):
        # Check if conditions are met (for backwards compatibility with goals check)
        prof = data['profile']
        should_run_stage3 = True

        # Special case: goals prompt only runs if goals are set
        if config['stage3_prompt'] == 'pipeline_goals':
            should_run_stage3 = bool(prof.get('goal_weight') or prof.get('goal_bf_pct'))

        if should_run_stage3:
            with get_db() as conn:
                cur = get_cursor(conn)
                cur.execute("SELECT template FROM ai_prompts WHERE slug=%s AND active=true", (config['stage3_prompt'],))
                goals_row = cur.fetchone()
            if goals_row:
                goals_prompt = _render_template(goals_row['template'], vars)

                if ANTHROPIC_KEY:
                    import anthropic
                    client = anthropic.Anthropic(api_key=ANTHROPIC_KEY)
                    response = client.messages.create(
                        model="claude-sonnet-4-20250514",
                        max_tokens=800,
                        messages=[{"role": "user", "content": goals_prompt}]
                    )
                    goals_text = response.content[0].text
                elif OPENROUTER_KEY:
                    async with httpx.AsyncClient() as client:
                        resp = await client.post("https://openrouter.ai/api/v1/chat/completions",
                            headers={"Authorization": f"Bearer {OPENROUTER_KEY}"},
                            json={
                                "model": OPENROUTER_MODEL,
                                "messages": [{"role": "user", "content": goals_prompt}],
                                "max_tokens": 800
                            },
                            timeout=60.0
                        )
                        goals_text = resp.json()['choices'][0]['message']['content']

    # Combine synthesis + goals
    final_content = synthesis
    if goals_text:
        final_content += "\n\n" + goals_text

    # Save with config-specific scope (with history - no DELETE)
    scope = f"pipeline_{config['name'].lower().replace(' ', '_')}"
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("INSERT INTO ai_insights (id, profile_id, scope, content, created) VALUES (%s,%s,%s,%s,CURRENT_TIMESTAMP)",
                    (str(uuid.uuid4()), pid, scope, final_content))

    logger.info(f"[PIPELINE] Completed '{config['name']}' - saved as scope='{scope}'")

    # Phase 2: Increment ai_calls usage (pipeline uses multiple API calls)
    # Note: We increment once per pipeline run, not per individual call
    increment_feature_usage(pid, 'ai_calls')

    # Old usage tracking (keep for now)
    inc_ai_usage(pid)

    return {
        "scope": scope,
        "content": final_content,
        "stage1": stage1_results,
        "config": {
            "id": config['id'],
            "name": config['name']
        }
    }

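The Stage 1 → Stage 2 hand-off above renames each prompt slug into a template placeholder before rendering the synthesis prompt. A standalone sketch of that mapping (the slugs shown are illustrative, not necessarily real prompt names):

```python
import json

def inject_stage1(vars: dict, stage1_results: dict) -> dict:
    """Map 'pipeline_<x>' slugs to 'stage1_<x>' template variables,
    serializing dict results as JSON, as in analyze_pipeline above."""
    for slug, result in stage1_results.items():
        placeholder = slug.replace('pipeline_', 'stage1_')
        vars[placeholder] = (json.dumps(result, ensure_ascii=False)
                             if isinstance(result, dict) else str(result))
    return vars

tvars = inject_stage1({}, {
    'pipeline_body': {'trend': 'down'},
    'pipeline_training': 'volume stable',
})
```

Dict results (parsed Stage 1 JSON) arrive in the Stage 2 template as JSON strings; raw-text fallbacks pass through unchanged.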
@router.get("/ai/usage")
def get_ai_usage(x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
    """Get AI usage stats for current profile."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("SELECT ai_limit_day FROM profiles WHERE id=%s", (pid,))
        prof = cur.fetchone()
        limit = prof['ai_limit_day'] if prof else None

        today = datetime.now().date().isoformat()
        cur.execute("SELECT call_count FROM ai_usage WHERE profile_id=%s AND date=%s", (pid, today))
        usage = cur.fetchone()
        used = usage['call_count'] if usage else 0

    return {"limit": limit, "used": used, "remaining": (limit - used) if limit else None}
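The response shape of `/ai/usage` is worth pinning down: because `limit` is tested for truthiness rather than `is not None`, a limit of 0 also yields `remaining: None`. A minimal mirror of the return expression:

```python
def usage_summary(limit, used):
    # Same expression as the endpoint's return value above;
    # note that limit=0 is falsy and so also yields remaining=None.
    return {"limit": limit, "used": used,
            "remaining": (limit - used) if limit else None}

print(usage_summary(10, 3))
print(usage_summary(None, 3))
```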
backend/routers/nutrition.py (new file, 289 lines)
@@ -0,0 +1,289 @@
"""
Nutrition Tracking Endpoints for Mitai Jinkendo

Handles nutrition data, FDDB CSV import, correlations, and weekly aggregates.
"""
import csv
import io
import uuid
import logging
from typing import Optional
from datetime import datetime

from fastapi import APIRouter, HTTPException, UploadFile, File, Header, Depends

from db import get_db, get_cursor, r2d
from auth import require_auth, check_feature_access, increment_feature_usage
from routers.profiles import get_pid
from feature_logger import log_feature_usage

router = APIRouter(prefix="/api/nutrition", tags=["nutrition"])
logger = logging.getLogger(__name__)


# ── Helper ────────────────────────────────────────────────────────────────────
def _pf(s):
    """Parse float from string (handles comma decimal separator)."""
    try: return float(str(s).replace(',', '.').strip())
    except: return 0.0

# ── Endpoints ─────────────────────────────────────────────────────────────────
@router.post("/import-csv")
async def import_nutrition_csv(file: UploadFile=File(...), x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
    """Import FDDB nutrition CSV."""
    pid = get_pid(x_profile_id)

    # Phase 4: Check feature access and ENFORCE
    # Note: CSV import can create many entries - we check once before import
    access = check_feature_access(pid, 'nutrition_entries')
    log_feature_usage(pid, 'nutrition_entries', access, 'import_csv')

    if not access['allowed']:
        logger.warning(
            f"[FEATURE-LIMIT] User {pid} blocked: "
            f"nutrition_entries {access['reason']} (used: {access['used']}, limit: {access['limit']})"
        )
        raise HTTPException(
            status_code=403,
            detail=f"Limit erreicht: Du hast das Kontingent für Ernährungseinträge überschritten ({access['used']}/{access['limit']}). "
                   f"Bitte kontaktiere den Admin oder warte bis zum nächsten Reset."
        )

    raw = await file.read()
    try: text = raw.decode('utf-8')
    except: text = raw.decode('latin-1')
    if text.startswith('\ufeff'): text = text[1:]
    if not text.strip(): raise HTTPException(400, "Leere Datei")
    reader = csv.DictReader(io.StringIO(text), delimiter=';')
    days: dict = {}
    count = 0
    for row in reader:
        rd = row.get('datum_tag_monat_jahr_stunde_minute', '').strip().strip('"')
        if not rd: continue
        try:
            p = rd.split(' ')[0].split('.')
            iso = f"{p[2]}-{p[1]}-{p[0]}"
        except: continue
        days.setdefault(iso, {'kcal': 0, 'fat_g': 0, 'carbs_g': 0, 'protein_g': 0})
        days[iso]['kcal'] += _pf(row.get('kj', 0)) / 4.184
        days[iso]['fat_g'] += _pf(row.get('fett_g', 0))
        days[iso]['carbs_g'] += _pf(row.get('kh_g', 0))
        days[iso]['protein_g'] += _pf(row.get('protein_g', 0))
        count += 1
    inserted = 0
    new_entries = 0
    with get_db() as conn:
        cur = get_cursor(conn)
        for iso, vals in days.items():
            kcal = round(vals['kcal'], 1); fat = round(vals['fat_g'], 1)
            carbs = round(vals['carbs_g'], 1); prot = round(vals['protein_g'], 1)
            cur.execute("SELECT id FROM nutrition_log WHERE profile_id=%s AND date=%s", (pid, iso))
            is_new = not cur.fetchone()
            if not is_new:
                # UPDATE existing
                cur.execute("UPDATE nutrition_log SET kcal=%s,protein_g=%s,fat_g=%s,carbs_g=%s WHERE profile_id=%s AND date=%s",
                            (kcal, prot, fat, carbs, pid, iso))
            else:
                # INSERT new
                cur.execute("INSERT INTO nutrition_log (id,profile_id,date,kcal,protein_g,fat_g,carbs_g,source,created) VALUES (%s,%s,%s,%s,%s,%s,%s,'csv',CURRENT_TIMESTAMP)",
                            (str(uuid.uuid4()), pid, iso, kcal, prot, fat, carbs))
                new_entries += 1
            inserted += 1

    # Phase 2: Increment usage counter for each new entry created
    for _ in range(new_entries):
        increment_feature_usage(pid, 'nutrition_entries')

    return {"rows_parsed": count, "days_imported": inserted, "new_entries": new_entries,
            "date_range": {"from": min(days) if days else None, "to": max(days) if days else None}}

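The two conversions in the import loop can be exercised in isolation; this sketch mirrors the endpoint's logic (the timestamp format and the 4.184 kJ-per-kcal factor come from the code above):

```python
KJ_PER_KCAL = 4.184  # the import divides the CSV's kJ column by this

def fddb_day_key(raw: str) -> str:
    """Turn an FDDB timestamp like '05.01.2025 08:30' into ISO
    '2025-01-05', matching the p[2]-p[1]-p[0] reordering above."""
    p = raw.split(' ')[0].split('.')
    return f"{p[2]}-{p[1]}-{p[0]}"

print(fddb_day_key("05.01.2025 08:30"))   # 2025-01-05
print(round(1000 / KJ_PER_KCAL, 1))       # 239.0 kcal per 1000 kJ
```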
@router.post("")
def create_nutrition(date: str, kcal: float, protein_g: float, fat_g: float, carbs_g: float,
                     x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
    """Create or update nutrition entry for a specific date."""
    pid = get_pid(x_profile_id)

    # Validate date format
    try:
        datetime.strptime(date, '%Y-%m-%d')
    except ValueError:
        raise HTTPException(400, "Ungültiges Datumsformat. Erwartet: YYYY-MM-DD")

    with get_db() as conn:
        cur = get_cursor(conn)
        # Check if entry exists
        cur.execute("SELECT id FROM nutrition_log WHERE profile_id=%s AND date=%s", (pid, date))
        existing = cur.fetchone()

        if existing:
            # UPDATE existing entry
            cur.execute("""
                UPDATE nutrition_log
                SET kcal=%s, protein_g=%s, fat_g=%s, carbs_g=%s, source='manual'
                WHERE id=%s AND profile_id=%s
            """, (round(kcal,1), round(protein_g,1), round(fat_g,1), round(carbs_g,1), existing['id'], pid))
            return {"success": True, "mode": "updated", "id": existing['id']}
        else:
            # Phase 4: Check feature access before INSERT
            access = check_feature_access(pid, 'nutrition_entries')
            log_feature_usage(pid, 'nutrition_entries', access, 'create')

            if not access['allowed']:
                logger.warning(
                    f"[FEATURE-LIMIT] User {pid} blocked: "
                    f"nutrition_entries {access['reason']} (used: {access['used']}, limit: {access['limit']})"
                )
                raise HTTPException(
                    status_code=403,
                    detail=f"Limit erreicht: Du hast das Kontingent für Ernährungseinträge überschritten ({access['used']}/{access['limit']}). "
                           f"Bitte kontaktiere den Admin oder warte bis zum nächsten Reset."
                )

            # INSERT new entry
            new_id = str(uuid.uuid4())
            cur.execute("""
                INSERT INTO nutrition_log (id, profile_id, date, kcal, protein_g, fat_g, carbs_g, source, created)
                VALUES (%s, %s, %s, %s, %s, %s, %s, 'manual', CURRENT_TIMESTAMP)
            """, (new_id, pid, date, round(kcal,1), round(protein_g,1), round(fat_g,1), round(carbs_g,1)))

            # Phase 2: Increment usage counter
            increment_feature_usage(pid, 'nutrition_entries')

            return {"success": True, "mode": "created", "id": new_id}

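The limit gate repeated across these endpoints (check access, log, raise 403) always builds the same German detail message. A minimal sketch of just the message construction, using the `access` dict shape implied by the calls above:

```python
def limit_message(access: dict, label: str):
    """Return the 403 detail used by the endpoints above when a feature
    is blocked, or None when access is allowed. `label` is the feature's
    display name (e.g. 'Ernährungseinträge')."""
    if access['allowed']:
        return None
    return (f"Limit erreicht: Du hast das Kontingent für {label} "
            f"überschritten ({access['used']}/{access['limit']}). "
            f"Bitte kontaktiere den Admin oder warte bis zum nächsten Reset.")

msg = limit_message({'allowed': False, 'used': 5, 'limit': 5}, 'Ernährungseinträge')
```

Factoring this into a shared helper would remove the four near-identical copies in this diff.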
@router.get("")
def list_nutrition(limit: int=365, x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
    """Get nutrition entries for current profile."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute(
            "SELECT * FROM nutrition_log WHERE profile_id=%s ORDER BY date DESC LIMIT %s", (pid, limit))
        return [r2d(r) for r in cur.fetchall()]


@router.get("/by-date/{date}")
def get_nutrition_by_date(date: str, x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
    """Get nutrition entry for a specific date."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("SELECT * FROM nutrition_log WHERE profile_id=%s AND date=%s", (pid, date))
        row = cur.fetchone()
        return r2d(row) if row else None


@router.get("/correlations")
def nutrition_correlations(x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
    """Get nutrition data correlated with weight and body fat."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("SELECT * FROM nutrition_log WHERE profile_id=%s ORDER BY date", (pid,))
        nutr = {r['date']: r2d(r) for r in cur.fetchall()}
        cur.execute("SELECT date,weight FROM weight_log WHERE profile_id=%s ORDER BY date", (pid,))
        wlog = {r['date']: r['weight'] for r in cur.fetchall()}
        cur.execute("SELECT date,lean_mass,body_fat_pct FROM caliper_log WHERE profile_id=%s ORDER BY date", (pid,))
        cals = sorted([r2d(r) for r in cur.fetchall()], key=lambda x: x['date'])
    all_dates = sorted(set(list(nutr) + list(wlog)))
    mi, last_cal, cal_by_date = 0, {}, {}
    for d in all_dates:
        while mi < len(cals) and cals[mi]['date'] <= d: last_cal = cals[mi]; mi += 1
        if last_cal: cal_by_date[d] = last_cal
    result = []
    for d in all_dates:
        if d not in nutr and d not in wlog: continue
        row = {'date': d}
        if d in nutr: row.update({k: float(nutr[d][k]) if nutr[d][k] is not None else None for k in ['kcal','protein_g','fat_g','carbs_g']})
        if d in wlog: row['weight'] = float(wlog[d])
        if d in cal_by_date:
            lm = cal_by_date[d].get('lean_mass')
            bf = cal_by_date[d].get('body_fat_pct')
            row['lean_mass'] = float(lm) if lm is not None else None
            row['body_fat_pct'] = float(bf) if bf is not None else None
        result.append(row)
    return result

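The merge-walk in `/correlations` carries the most recent caliper measurement forward to every later date (a forward fill over sparse measurements). The same logic, extracted for clarity:

```python
def forward_fill(all_dates, measurements):
    """For each date, attach the latest measurement on or before it
    (mirrors the mi/last_cal walk in nutrition_correlations above).
    Both inputs must be sorted by date."""
    mi, last, out = 0, None, {}
    for d in all_dates:
        while mi < len(measurements) and measurements[mi]['date'] <= d:
            last = measurements[mi]
            mi += 1
        if last is not None:
            out[d] = last
    return out

filled = forward_fill(
    ['2025-01-01', '2025-01-05', '2025-01-09'],
    [{'date': '2025-01-01', 'body_fat_pct': 20.0},
     {'date': '2025-01-07', 'body_fat_pct': 19.5}],
)
```

Because both lists are sorted, the walk is linear in the total number of dates and measurements.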
@router.get("/weekly")
def nutrition_weekly(weeks: int=16, x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
    """Get nutrition data aggregated by week."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("SELECT * FROM nutrition_log WHERE profile_id=%s ORDER BY date DESC LIMIT %s", (pid, weeks*7))
        rows = [r2d(r) for r in cur.fetchall()]
    if not rows: return []
    wm = {}
    for d in rows:
        # Handle both datetime.date objects (from DB) and strings
        date_obj = d['date'] if hasattr(d['date'], 'strftime') else datetime.strptime(d['date'], '%Y-%m-%d')
        wk = date_obj.strftime('%Y-W%V')
        wm.setdefault(wk, []).append(d)
    result = []
    for wk in sorted(wm):
        en = wm[wk]; n = len(en)
        def avg(k): return round(sum(float(e.get(k) or 0) for e in en) / n, 1)
        result.append({'week': wk, 'days': n, 'kcal': avg('kcal'), 'protein_g': avg('protein_g'), 'fat_g': avg('fat_g'), 'carbs_g': avg('carbs_g')})
    return result

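The `/weekly` grouping key combines `%Y` (calendar year) with `%V` (ISO-8601 week number). A sketch of the key, including the year-boundary quirk that combination carries (ISO standards would pair `%V` with `%G`):

```python
from datetime import date

def week_key(d: date) -> str:
    """The grouping key used by /weekly above. Because %Y is the
    calendar year while %V is the ISO week, dates at year boundaries
    can land in a week labeled with the 'wrong' year."""
    return d.strftime('%Y-W%V')

print(week_key(date(2025, 1, 6)))    # Monday starting ISO week 2
print(week_key(date(2025, 12, 31)))  # ISO week 1 of 2026, labeled 2025
```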
@router.get("/import-history")
def import_history(x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
    """Get import history by grouping entries by created timestamp."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""
            SELECT
                DATE(created) as import_date,
                COUNT(*) as count,
                MIN(date) as date_from,
                MAX(date) as date_to,
                MAX(created) as last_created
            FROM nutrition_log
            WHERE profile_id=%s AND source='csv'
            GROUP BY DATE(created)
            ORDER BY DATE(created) DESC
        """, (pid,))
        return [r2d(r) for r in cur.fetchall()]


@router.put("/{entry_id}")
def update_nutrition(entry_id: str, kcal: float, protein_g: float, fat_g: float, carbs_g: float,
                     x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
    """Update nutrition entry macros."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        cur = get_cursor(conn)
        # Verify ownership
        cur.execute("SELECT id FROM nutrition_log WHERE id=%s AND profile_id=%s", (entry_id, pid))
        if not cur.fetchone():
            raise HTTPException(404, "Eintrag nicht gefunden")

        cur.execute("""
            UPDATE nutrition_log
            SET kcal=%s, protein_g=%s, fat_g=%s, carbs_g=%s
            WHERE id=%s AND profile_id=%s
        """, (round(kcal,1), round(protein_g,1), round(fat_g,1), round(carbs_g,1), entry_id, pid))

    return {"success": True}


@router.delete("/{entry_id}")
def delete_nutrition(entry_id: str, x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
    """Delete nutrition entry."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        cur = get_cursor(conn)
        # Verify ownership
        cur.execute("SELECT id FROM nutrition_log WHERE id=%s AND profile_id=%s", (entry_id, pid))
        if not cur.fetchone():
            raise HTTPException(404, "Eintrag nicht gefunden")

        cur.execute("DELETE FROM nutrition_log WHERE id=%s AND profile_id=%s", (entry_id, pid))

    return {"success": True}
backend/routers/photos.py (new file, 90 lines)
@@ -0,0 +1,90 @@
"""
Photo Management Endpoints for Mitai Jinkendo

Handles progress photo uploads and retrieval.
"""
import os
import uuid
import logging
from pathlib import Path
from typing import Optional

from fastapi import APIRouter, UploadFile, File, Form, Header, HTTPException, Depends
from fastapi.responses import FileResponse
import aiofiles

from db import get_db, get_cursor, r2d
from auth import require_auth, require_auth_flexible, check_feature_access, increment_feature_usage
from routers.profiles import get_pid
from feature_logger import log_feature_usage

router = APIRouter(prefix="/api/photos", tags=["photos"])
logger = logging.getLogger(__name__)

PHOTOS_DIR = Path(os.getenv("PHOTOS_DIR", "./photos"))
PHOTOS_DIR.mkdir(parents=True, exist_ok=True)


@router.post("")
async def upload_photo(file: UploadFile=File(...), date: str=Form(""),
                       x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
    """Upload progress photo."""
    pid = get_pid(x_profile_id)

    # Phase 4: Check feature access and ENFORCE
    access = check_feature_access(pid, 'photos')
    log_feature_usage(pid, 'photos', access, 'upload')

    if not access['allowed']:
        logger.warning(
            f"[FEATURE-LIMIT] User {pid} blocked: "
            f"photos {access['reason']} (used: {access['used']}, limit: {access['limit']})"
        )
        raise HTTPException(
            status_code=403,
            detail=f"Limit erreicht: Du hast das Kontingent für Fotos überschritten ({access['used']}/{access['limit']}). "
                   f"Bitte kontaktiere den Admin oder warte bis zum nächsten Reset."
        )

    fid = str(uuid.uuid4())
    ext = Path(file.filename).suffix or '.jpg'
    path = PHOTOS_DIR / f"{fid}{ext}"
    async with aiofiles.open(path, 'wb') as f: await f.write(await file.read())

    # Convert empty string to NULL for date field
    photo_date = date if date and date.strip() else None

    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("INSERT INTO photos (id,profile_id,date,path,created) VALUES (%s,%s,%s,%s,CURRENT_TIMESTAMP)",
                    (fid, pid, photo_date, str(path)))

    # Phase 2: Increment usage counter
    increment_feature_usage(pid, 'photos')

    return {"id": fid, "date": photo_date}

@router.get("/{fid}")
def get_photo(fid: str, session: dict=Depends(require_auth_flexible)):
    """Get photo by ID. Auth via header or query param (for <img> tags)."""
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("SELECT path FROM photos WHERE id=%s", (fid,))
        row = cur.fetchone()
        if not row: raise HTTPException(404, "Photo not found")
        photo_path = Path(PHOTOS_DIR) / row['path']
        if not photo_path.exists():
            raise HTTPException(404, "Photo file not found")
        return FileResponse(photo_path)


@router.get("")
def list_photos(x_profile_id: Optional[str]=Header(default=None), session: dict=Depends(require_auth)):
    """Get all photos for current profile."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute(
            "SELECT * FROM photos WHERE profile_id=%s ORDER BY created DESC LIMIT 100", (pid,))
        return [r2d(r) for r in cur.fetchall()]
backend/routers/profiles.py (new file, 107 lines)
@@ -0,0 +1,107 @@
"""
Profile Management Endpoints for Mitai Jinkendo

Handles profile CRUD operations for both admin and current user.
"""
import uuid
from typing import Optional
from datetime import datetime

from fastapi import APIRouter, HTTPException, Header, Depends

from db import get_db, get_cursor, r2d
from auth import require_auth
from models import ProfileCreate, ProfileUpdate

router = APIRouter(prefix="/api", tags=["profiles"])


# ── Helper ────────────────────────────────────────────────────────────────────
def get_pid(x_profile_id: Optional[str] = Header(default=None)) -> str:
    """Get profile_id - from header for legacy endpoints."""
    if x_profile_id:
        return x_profile_id
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("SELECT id FROM profiles ORDER BY created LIMIT 1")
        row = cur.fetchone()
        if row: return row['id']
    raise HTTPException(400, "Kein Profil gefunden")

# ── Admin Profile Management ──────────────────────────────────────────────────
@router.get("/profiles")
def list_profiles(session=Depends(require_auth)):
    """List all profiles (admin)."""
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("SELECT * FROM profiles ORDER BY created")
        rows = cur.fetchall()
        return [r2d(r) for r in rows]


@router.post("/profiles")
def create_profile(p: ProfileCreate, session=Depends(require_auth)):
    """Create new profile (admin)."""
    pid = str(uuid.uuid4())
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""INSERT INTO profiles (id,name,avatar_color,sex,dob,height,goal_weight,goal_bf_pct,created,updated)
                       VALUES (%s,%s,%s,%s,%s,%s,%s,%s,CURRENT_TIMESTAMP,CURRENT_TIMESTAMP)""",
                    (pid, p.name, p.avatar_color, p.sex, p.dob, p.height, p.goal_weight, p.goal_bf_pct))
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("SELECT * FROM profiles WHERE id=%s", (pid,))
        return r2d(cur.fetchone())


@router.get("/profiles/{pid}")
def get_profile(pid: str, session=Depends(require_auth)):
    """Get profile by ID."""
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("SELECT * FROM profiles WHERE id=%s", (pid,))
        row = cur.fetchone()
        if not row: raise HTTPException(404, "Profil nicht gefunden")
        return r2d(row)


@router.put("/profiles/{pid}")
def update_profile(pid: str, p: ProfileUpdate, session=Depends(require_auth)):
    """Update profile by ID (admin)."""
    with get_db() as conn:
        data = {k: v for k, v in p.model_dump().items() if v is not None}
        data['updated'] = datetime.now().isoformat()
        cur = get_cursor(conn)
        cur.execute(f"UPDATE profiles SET {', '.join(f'{k}=%s' for k in data)} WHERE id=%s",
                    list(data.values()) + [pid])
    return get_profile(pid, session)

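The dynamic UPDATE in `update_profile` builds its SET clause from whichever fields the client actually provided. The clause construction, isolated:

```python
def build_profile_update(data: dict) -> str:
    """Reproduce the f-string SQL from update_profile: one %s
    placeholder per provided field, in dict insertion order. Safe here
    only because the keys come from the Pydantic model's fields, never
    from client-supplied strings."""
    return f"UPDATE profiles SET {', '.join(f'{k}=%s' for k in data)} WHERE id=%s"

sql = build_profile_update({'name': 'Ada', 'updated': '2025-01-01T00:00:00'})
```

The values themselves still go through parameterized `%s` placeholders, so only the whitelisted column names ever reach the SQL text.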
@router.delete("/profiles/{pid}")
|
||||||
|
def delete_profile(pid: str, session=Depends(require_auth)):
|
||||||
|
"""Delete profile (admin)."""
|
||||||
|
with get_db() as conn:
|
||||||
|
cur = get_cursor(conn)
|
||||||
|
cur.execute("SELECT COUNT(*) as count FROM profiles")
|
||||||
|
count = cur.fetchone()['count']
|
||||||
|
if count <= 1: raise HTTPException(400, "Letztes Profil kann nicht gelöscht werden")
|
||||||
|
for table in ['weight_log','circumference_log','caliper_log','nutrition_log','activity_log','ai_insights']:
|
||||||
|
cur.execute(f"DELETE FROM {table} WHERE profile_id=%s", (pid,))
|
||||||
|
cur.execute("DELETE FROM profiles WHERE id=%s", (pid,))
|
||||||
|
return {"ok": True}
|
||||||
|
|
||||||
|
|
||||||
|
# ── Current User Profile ──────────────────────────────────────────────────────
|
||||||
|
@router.get("/profile")
|
||||||
|
def get_active_profile(x_profile_id: Optional[str] = Header(default=None), session: dict = Depends(require_auth)):
|
||||||
|
"""Legacy endpoint – returns active profile."""
|
||||||
|
pid = get_pid(x_profile_id)
|
||||||
|
return get_profile(pid, session)
|
||||||
|
|
||||||
|
|
||||||
|
@router.put("/profile")
|
||||||
|
def update_active_profile(p: ProfileUpdate, x_profile_id: Optional[str] = Header(default=None), session: dict = Depends(require_auth)):
|
||||||
|
"""Update current user's profile."""
|
||||||
|
pid = get_pid(x_profile_id)
|
||||||
|
return update_profile(pid, p, session)
|
||||||
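The dynamic `UPDATE` in `update_profile` pairs each non-null field with a `%s` placeholder and appends the key last. A standalone sketch of that pattern (the helper name and sample values are illustrative, not part of the codebase):

```python
def build_update(table: str, data: dict, key: str) -> tuple[str, list]:
    """Build a parameterized UPDATE from a dict of column -> value.

    Column names must come from trusted code (e.g. a Pydantic model's
    fields), never from user input, since they are interpolated into
    the SQL text; only the values go through placeholders.
    """
    set_clause = ", ".join(f"{col}=%s" for col in data)
    sql = f"UPDATE {table} SET {set_clause} WHERE id=%s"
    return sql, list(data.values()) + [key]


sql, params = build_update("profiles", {"name": "Alice", "updated": "2026-01-01"}, "p1")
print(sql)     # UPDATE profiles SET name=%s, updated=%s WHERE id=%s
print(params)  # ['Alice', '2026-01-01', 'p1']
```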
backend/routers/prompts.py (new file, 1235 lines; diff suppressed because it is too large)

backend/routers/rest_days.py (new file, 368 lines)
@@ -0,0 +1,368 @@
"""
Rest Days Endpoints for Mitai Jinkendo

Context-specific rest days with flexible JSONB configuration.
"""
import logging
from typing import Optional, Literal
from datetime import datetime, timedelta

from fastapi import APIRouter, HTTPException, Depends, Header
from pydantic import BaseModel, Field
from psycopg2.extras import Json
from psycopg2.errors import UniqueViolation

from db import get_db, get_cursor, r2d
from auth import require_auth
from routers.profiles import get_pid

router = APIRouter(prefix="/api/rest-days", tags=["rest-days"])
logger = logging.getLogger(__name__)


# ── Models ────────────────────────────────────────────────────────────────────

class RestConfig(BaseModel):
    focus: Literal['muscle_recovery', 'cardio_recovery', 'mental_rest', 'deload', 'injury']
    rest_from: list[str] = Field(default_factory=list, description="Training type IDs to avoid")
    allows: list[str] = Field(default_factory=list, description="Allowed activity type IDs")
    intensity_max: Optional[int] = Field(None, ge=1, le=100, description="Max HR% for allowed activities")
    note: str = ""


class RestDayCreate(BaseModel):
    date: str  # YYYY-MM-DD
    rest_config: RestConfig
    note: str = ""


class RestDayUpdate(BaseModel):
    date: Optional[str] = None
    rest_config: Optional[RestConfig] = None
    note: Optional[str] = None


class ActivityConflictCheck(BaseModel):
    date: str
    activity_type: str
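For illustration, a `RestDayCreate` request body for the POST endpoint below could look like this (all values are made up):

```python
# Hypothetical JSON body for POST /api/rest-days
payload = {
    "date": "2026-03-14",
    "rest_config": {
        "focus": "muscle_recovery",
        "rest_from": ["strength"],       # training type IDs to avoid today
        "allows": ["walking", "yoga"],   # light movement stays allowed
        "intensity_max": 60,             # cap allowed activities at 60% HRmax
        "note": "",
    },
    "note": "Legs are sore",
}
```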
# ── CRUD Endpoints ────────────────────────────────────────────────────────────

@router.get("")
def list_rest_days(
    limit: int = 90,
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """List rest days for current profile (last N days)."""
    pid = get_pid(x_profile_id)

    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute(
            """
            SELECT id, profile_id, date, rest_config, note, created_at
            FROM rest_days
            WHERE profile_id = %s
            ORDER BY date DESC
            LIMIT %s
            """,
            (pid, limit)
        )
        return [r2d(r) for r in cur.fetchall()]


@router.post("")
def create_rest_day(
    data: RestDayCreate,
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """Create rest day with JSONB config. Multiple entries per date are allowed, but only one per focus."""
    pid = get_pid(x_profile_id)

    # Validate date format
    try:
        datetime.strptime(data.date, '%Y-%m-%d')
    except ValueError:
        raise HTTPException(400, "Invalid date format. Use YYYY-MM-DD")

    # Convert RestConfig to dict for JSONB storage
    config_dict = data.rest_config.model_dump()
    focus = data.rest_config.focus

    try:
        with get_db() as conn:
            cur = get_cursor(conn)

            # Insert (multiple entries per date allowed, but not same focus)
            cur.execute(
                """
                INSERT INTO rest_days (profile_id, date, focus, rest_config, note, created_at)
                VALUES (%s, %s, %s, %s, %s, CURRENT_TIMESTAMP)
                RETURNING id, profile_id, date, focus, rest_config, note, created_at
                """,
                (pid, data.date, focus, Json(config_dict), data.note)
            )

            result = cur.fetchone()
            return r2d(result)
    except UniqueViolation:
        # User-friendly error for duplicate focus
        focus_labels = {
            'muscle_recovery': 'Muskelregeneration',
            'cardio_recovery': 'Cardio-Erholung',
            'mental_rest': 'Mentale Erholung',
            'deload': 'Deload',
            'injury': 'Verletzungspause',
        }
        focus_label = focus_labels.get(focus, focus)
        raise HTTPException(
            400,
            f"Du hast bereits einen Ruhetag '{focus_label}' für {data.date}. Bitte wähle einen anderen Typ oder lösche den bestehenden Eintrag."
        )


@router.get("/{rest_day_id}")
def get_rest_day(
    rest_day_id: int,
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """Get single rest day by ID."""
    pid = get_pid(x_profile_id)

    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute(
            """
            SELECT id, profile_id, date, rest_config, note, created_at
            FROM rest_days
            WHERE id = %s AND profile_id = %s
            """,
            (rest_day_id, pid)
        )

        row = cur.fetchone()
        if not row:
            raise HTTPException(404, "Rest day not found")

        return r2d(row)


@router.put("/{rest_day_id}")
def update_rest_day(
    rest_day_id: int,
    data: RestDayUpdate,
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """Update rest day."""
    pid = get_pid(x_profile_id)

    # Build update fields dynamically
    updates = []
    values = []

    if data.date:
        try:
            datetime.strptime(data.date, '%Y-%m-%d')
        except ValueError:
            raise HTTPException(400, "Invalid date format. Use YYYY-MM-DD")
        updates.append("date = %s")
        values.append(data.date)

    if data.rest_config:
        updates.append("rest_config = %s")
        values.append(Json(data.rest_config.model_dump()))
        # Also update focus column if config changed
        updates.append("focus = %s")
        values.append(data.rest_config.focus)

    if data.note is not None:
        updates.append("note = %s")
        values.append(data.note)

    if not updates:
        raise HTTPException(400, "No fields to update")

    values.extend([rest_day_id, pid])

    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute(
            f"""
            UPDATE rest_days
            SET {', '.join(updates)}
            WHERE id = %s AND profile_id = %s
            RETURNING id, profile_id, date, rest_config, note, created_at
            """,
            values
        )

        result = cur.fetchone()
        if not result:
            raise HTTPException(404, "Rest day not found")

        return r2d(result)


@router.delete("/{rest_day_id}")
def delete_rest_day(
    rest_day_id: int,
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """Delete rest day."""
    pid = get_pid(x_profile_id)

    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute(
            "DELETE FROM rest_days WHERE id = %s AND profile_id = %s RETURNING id",
            (rest_day_id, pid)
        )

        result = cur.fetchone()
        if not result:
            raise HTTPException(404, "Rest day not found")

        return {"deleted": True, "id": result['id']}


# ── Stats & Validation ────────────────────────────────────────────────────────

# NOTE: this route is registered after "/{rest_day_id}". FastAPI matches routes
# in registration order, so GET /api/rest-days/stats is captured by the
# parameterized route first and fails integer parsing; register this route
# before "/{rest_day_id}" to make it reachable.
@router.get("/stats")
def get_rest_days_stats(
    weeks: int = 4,
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """Get rest day statistics (count per week, focus distribution)."""
    pid = get_pid(x_profile_id)

    cutoff_date = (datetime.now() - timedelta(weeks=weeks)).strftime('%Y-%m-%d')

    with get_db() as conn:
        cur = get_cursor(conn)

        # Total count
        cur.execute(
            """
            SELECT COUNT(*) as total
            FROM rest_days
            WHERE profile_id = %s AND date >= %s
            """,
            (pid, cutoff_date)
        )
        total = cur.fetchone()['total']

        # Count by focus type
        cur.execute(
            """
            SELECT
                rest_config->>'focus' as focus,
                COUNT(*) as count
            FROM rest_days
            WHERE profile_id = %s AND date >= %s
            GROUP BY rest_config->>'focus'
            ORDER BY count DESC
            """,
            (pid, cutoff_date)
        )
        by_focus = [r2d(r) for r in cur.fetchall()]

        # Count by week (ISO week number)
        cur.execute(
            """
            SELECT
                EXTRACT(YEAR FROM date) as year,
                EXTRACT(WEEK FROM date) as week,
                COUNT(*) as count
            FROM rest_days
            WHERE profile_id = %s AND date >= %s
            GROUP BY year, week
            ORDER BY year DESC, week DESC
            """,
            (pid, cutoff_date)
        )
        by_week = [r2d(r) for r in cur.fetchall()]

        return {
            "total_rest_days": total,
            "weeks_analyzed": weeks,
            "by_focus": by_focus,
            "by_week": by_week
        }


@router.post("/validate-activity")
def validate_activity(
    data: ActivityConflictCheck,
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """
    Check if activity conflicts with rest day configuration.

    Returns:
    - conflict: bool
    - severity: 'warning' | 'info' | 'none'
    - message: str
    """
    pid = get_pid(x_profile_id)

    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute(
            """
            SELECT rest_config
            FROM rest_days
            WHERE profile_id = %s AND date = %s
            """,
            (pid, data.date)
        )

        row = cur.fetchone()
        if not row:
            return {"conflict": False, "severity": "none", "message": ""}

        config = row['rest_config']

        # Check if activity is in rest_from
        if data.activity_type in config.get('rest_from', []):
            focus_labels = {
                'muscle_recovery': 'Muskelregeneration',
                'cardio_recovery': 'Cardio-Erholung',
                'mental_rest': 'Mentale Erholung',
                'deload': 'Deload',
                'injury': 'Verletzungspause'
            }
            focus_label = focus_labels.get(config.get('focus'), 'Ruhetag')

            return {
                "conflict": True,
                "severity": "warning",
                "message": f"Ruhetag ({focus_label}) – {data.activity_type} sollte pausiert werden. Trotzdem erfassen?"
            }

        # Check if activity is allowed
        allows_list = config.get('allows', [])
        if allows_list and data.activity_type not in allows_list:
            return {
                "conflict": True,
                "severity": "info",
                "message": f"Aktivität nicht in erlaubten Aktivitäten. Heute: {', '.join(allows_list) or 'Keine'}."
            }

        # No hard conflict; surface the configured intensity cap as an info hint
        intensity_max = config.get('intensity_max')
        if intensity_max:
            return {
                "conflict": False,
                "severity": "info",
                "message": f"Erlaubt bei max. {intensity_max}% HFmax."
            }

        return {"conflict": False, "severity": "none", "message": ""}
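The decision order in `validate_activity` can be exercised without a database. A standalone sketch of the same rules (the helper name is illustrative; it mirrors the endpoint's logic, not its exact return payload):

```python
def check_conflict(config: dict, activity_type: str) -> dict:
    """Replicates the rule order: rest_from beats allows beats intensity cap."""
    # Explicitly forbidden activity -> hard warning
    if activity_type in config.get('rest_from', []):
        return {"conflict": True, "severity": "warning"}
    # Allow-list exists and activity is not on it -> soft info conflict
    allows = config.get('allows', [])
    if allows and activity_type not in allows:
        return {"conflict": True, "severity": "info"}
    # Allowed, but an intensity cap is configured -> info hint, no conflict
    if config.get('intensity_max'):
        return {"conflict": False, "severity": "info"}
    return {"conflict": False, "severity": "none"}


cfg = {"rest_from": ["strength"], "allows": ["walking"], "intensity_max": 60}
print(check_conflict(cfg, "strength"))  # {'conflict': True, 'severity': 'warning'}
print(check_conflict(cfg, "cycling"))   # {'conflict': True, 'severity': 'info'}
print(check_conflict(cfg, "walking"))   # {'conflict': False, 'severity': 'info'}
```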
backend/routers/sleep.py (new file, 660 lines)
@@ -0,0 +1,660 @@
"""
Sleep Module Router (v9d Phase 2b)

Endpoints:
- CRUD: list, create/upsert, update, delete
- Stats: 7-day average, trends, phase distribution, sleep debt
- Correlations: sleep ↔ resting HR, training, weight (Phase 2e)
"""
from fastapi import APIRouter, Depends, HTTPException, UploadFile, File
from pydantic import BaseModel
from typing import Literal
from datetime import datetime, timedelta
import csv
import io
import json

from auth import require_auth
from db import get_db, get_cursor

router = APIRouter(prefix="/api/sleep", tags=["sleep"])


# ── Models ────────────────────────────────────────────────────────────────────

class SleepCreate(BaseModel):
    date: str  # YYYY-MM-DD
    bedtime: str | None = None  # HH:MM
    wake_time: str | None = None  # HH:MM
    duration_minutes: int
    quality: int | None = None  # 1-5
    wake_count: int | None = None
    deep_minutes: int | None = None
    rem_minutes: int | None = None
    light_minutes: int | None = None
    awake_minutes: int | None = None
    note: str = ""
    source: Literal['manual', 'apple_health', 'garmin'] = 'manual'


class SleepResponse(BaseModel):
    id: int
    profile_id: str
    date: str
    bedtime: str | None
    wake_time: str | None
    duration_minutes: int
    duration_formatted: str
    quality: int | None
    wake_count: int | None
    deep_minutes: int | None
    rem_minutes: int | None
    light_minutes: int | None
    awake_minutes: int | None
    sleep_segments: list | None
    sleep_efficiency: float | None
    deep_percent: float | None
    rem_percent: float | None
    note: str
    source: str
    created_at: str


class SleepStatsResponse(BaseModel):
    avg_duration_minutes: float
    avg_quality: float | None
    total_nights: int
    nights_below_goal: int
    sleep_goal_minutes: int


class SleepDebtResponse(BaseModel):
    sleep_debt_minutes: int
    sleep_debt_formatted: str
    days_analyzed: int
    sleep_goal_minutes: int


# ── Helper Functions ──────────────────────────────────────────────────────────

def format_duration(minutes: int) -> str:
    """Convert minutes to 'Xh Ymin' format."""
    hours = minutes // 60
    mins = minutes % 60
    return f"{hours}h {mins}min"


def calculate_sleep_efficiency(duration_min: int, awake_min: int | None) -> float | None:
    """Sleep efficiency = duration / (duration + awake) * 100."""
    if awake_min is None or awake_min == 0:
        return None
    total = duration_min + awake_min
    return round((duration_min / total) * 100, 1) if total > 0 else None


def calculate_phase_percent(phase_min: int | None, duration_min: int) -> float | None:
    """Calculate phase percentage of total duration."""
    if phase_min is None or duration_min == 0:
        return None
    return round((phase_min / duration_min) * 100, 1)
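A quick sanity check of the three helpers above (definitions repeated so the snippet runs on its own; the sample numbers are made up):

```python
def format_duration(minutes: int) -> str:
    # 454 min -> "7h 34min"
    return f"{minutes // 60}h {minutes % 60}min"

def calculate_sleep_efficiency(duration_min, awake_min):
    # Share of time in bed actually asleep; None when awake time is unknown or zero
    if awake_min is None or awake_min == 0:
        return None
    total = duration_min + awake_min
    return round((duration_min / total) * 100, 1) if total > 0 else None

def calculate_phase_percent(phase_min, duration_min):
    # Phase share of total sleep duration
    if phase_min is None or duration_min == 0:
        return None
    return round((phase_min / duration_min) * 100, 1)

print(format_duration(454))                 # 7h 34min
print(calculate_sleep_efficiency(420, 30))  # 93.3  (420 / 450 * 100)
print(calculate_phase_percent(90, 450))     # 20.0
```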
def row_to_sleep_response(row: dict) -> SleepResponse:
    """Convert DB row to SleepResponse."""
    return SleepResponse(
        id=row['id'],
        profile_id=row['profile_id'],
        date=str(row['date']),
        bedtime=str(row['bedtime']) if row['bedtime'] else None,
        wake_time=str(row['wake_time']) if row['wake_time'] else None,
        duration_minutes=row['duration_minutes'],
        duration_formatted=format_duration(row['duration_minutes']),
        quality=row['quality'],
        wake_count=row['wake_count'],
        deep_minutes=row['deep_minutes'],
        rem_minutes=row['rem_minutes'],
        light_minutes=row['light_minutes'],
        awake_minutes=row['awake_minutes'],
        sleep_segments=row['sleep_segments'],
        sleep_efficiency=calculate_sleep_efficiency(row['duration_minutes'], row['awake_minutes']),
        deep_percent=calculate_phase_percent(row['deep_minutes'], row['duration_minutes']),
        rem_percent=calculate_phase_percent(row['rem_minutes'], row['duration_minutes']),
        note=row['note'] or "",
        source=row['source'],
        created_at=str(row['created_at'])
    )


# ── CRUD Endpoints ────────────────────────────────────────────────────────────

@router.get("")
def list_sleep(
    limit: int = 90,
    session: dict = Depends(require_auth)
):
    """List sleep entries for current profile (last N days)."""
    pid = session['profile_id']

    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""
            SELECT * FROM sleep_log
            WHERE profile_id = %s
            ORDER BY date DESC
            LIMIT %s
        """, (pid, limit))
        rows = cur.fetchall()

        return [row_to_sleep_response(row) for row in rows]


@router.get("/by-date/{date}")
def get_sleep_by_date(
    date: str,
    session: dict = Depends(require_auth)
):
    """Get sleep entry for specific date."""
    pid = session['profile_id']

    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""
            SELECT * FROM sleep_log
            WHERE profile_id = %s AND date = %s
        """, (pid, date))
        row = cur.fetchone()

        if not row:
            raise HTTPException(404, "No sleep entry for this date")

        return row_to_sleep_response(row)


@router.post("")
def create_sleep(
    data: SleepCreate,
    session: dict = Depends(require_auth)
):
    """Create or update sleep entry (upsert by date)."""
    pid = session['profile_id']

    # Convert empty strings to None for TIME fields
    bedtime = data.bedtime if data.bedtime else None
    wake_time = data.wake_time if data.wake_time else None

    # Plausibility check: sleep phases (deep+rem+light) should sum to duration
    # Note: awake_minutes is NOT part of sleep duration (tracked separately)
    if any([data.deep_minutes, data.rem_minutes, data.light_minutes]):
        sleep_phase_sum = (data.deep_minutes or 0) + (data.rem_minutes or 0) + (data.light_minutes or 0)
        diff = abs(data.duration_minutes - sleep_phase_sum)
        if diff > 5:
            raise HTTPException(
                400,
                f"Plausibilitätsprüfung fehlgeschlagen: Schlafphasen-Summe ({sleep_phase_sum} min) weicht um {diff} min von Schlafdauer ({data.duration_minutes} min) ab. Max. Toleranz: 5 min. Hinweis: Wachphasen werden nicht zur Schlafdauer gezählt."
            )

    with get_db() as conn:
        cur = get_cursor(conn)

        # Upsert: INSERT ... ON CONFLICT DO UPDATE
        cur.execute("""
            INSERT INTO sleep_log (
                profile_id, date, bedtime, wake_time, duration_minutes,
                quality, wake_count, deep_minutes, rem_minutes, light_minutes,
                awake_minutes, note, source, updated_at
            ) VALUES (
                %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, CURRENT_TIMESTAMP
            )
            ON CONFLICT (profile_id, date) DO UPDATE SET
                bedtime = EXCLUDED.bedtime,
                wake_time = EXCLUDED.wake_time,
                duration_minutes = EXCLUDED.duration_minutes,
                quality = EXCLUDED.quality,
                wake_count = EXCLUDED.wake_count,
                deep_minutes = EXCLUDED.deep_minutes,
                rem_minutes = EXCLUDED.rem_minutes,
                light_minutes = EXCLUDED.light_minutes,
                awake_minutes = EXCLUDED.awake_minutes,
                note = EXCLUDED.note,
                source = EXCLUDED.source,
                updated_at = CURRENT_TIMESTAMP
            RETURNING *
        """, (
            pid, data.date, bedtime, wake_time, data.duration_minutes,
            data.quality, data.wake_count, data.deep_minutes, data.rem_minutes,
            data.light_minutes, data.awake_minutes, data.note, data.source
        ))

        row = cur.fetchone()
        conn.commit()

        return row_to_sleep_response(row)


@router.put("/{id}")
def update_sleep(
    id: int,
    data: SleepCreate,
    session: dict = Depends(require_auth)
):
    """Update existing sleep entry by ID."""
    pid = session['profile_id']

    # Convert empty strings to None for TIME fields
    bedtime = data.bedtime if data.bedtime else None
    wake_time = data.wake_time if data.wake_time else None

    # Plausibility check: sleep phases (deep+rem+light) should sum to duration
    # Note: awake_minutes is NOT part of sleep duration (tracked separately)
    if any([data.deep_minutes, data.rem_minutes, data.light_minutes]):
        sleep_phase_sum = (data.deep_minutes or 0) + (data.rem_minutes or 0) + (data.light_minutes or 0)
        diff = abs(data.duration_minutes - sleep_phase_sum)
        if diff > 5:
            raise HTTPException(
                400,
                f"Plausibilitätsprüfung fehlgeschlagen: Schlafphasen-Summe ({sleep_phase_sum} min) weicht um {diff} min von Schlafdauer ({data.duration_minutes} min) ab. Max. Toleranz: 5 min. Hinweis: Wachphasen werden nicht zur Schlafdauer gezählt."
            )

    with get_db() as conn:
        cur = get_cursor(conn)

        cur.execute("""
            UPDATE sleep_log SET
                date = %s,
                bedtime = %s,
                wake_time = %s,
                duration_minutes = %s,
                quality = %s,
                wake_count = %s,
                deep_minutes = %s,
                rem_minutes = %s,
                light_minutes = %s,
                awake_minutes = %s,
                note = %s,
                updated_at = CURRENT_TIMESTAMP
            WHERE id = %s AND profile_id = %s
            RETURNING *
        """, (
            data.date, bedtime, wake_time, data.duration_minutes,
            data.quality, data.wake_count, data.deep_minutes, data.rem_minutes,
            data.light_minutes, data.awake_minutes, data.note, id, pid
        ))

        row = cur.fetchone()
        if not row:
            raise HTTPException(404, "Sleep entry not found")

        conn.commit()

        return row_to_sleep_response(row)


@router.delete("/{id}")
def delete_sleep(
    id: int,
    session: dict = Depends(require_auth)
):
    """Delete sleep entry."""
    pid = session['profile_id']

    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""
            DELETE FROM sleep_log
            WHERE id = %s AND profile_id = %s
        """, (id, pid))
        conn.commit()

        return {"deleted": id}


# ── Stats Endpoints ───────────────────────────────────────────────────────────

@router.get("/stats")
def get_sleep_stats(
    days: int = 7,
    session: dict = Depends(require_auth)
):
    """Get sleep statistics (average duration, quality, nights below goal)."""
    pid = session['profile_id']

    with get_db() as conn:
        cur = get_cursor(conn)

        # Get sleep goal from profile
        cur.execute("SELECT sleep_goal_minutes FROM profiles WHERE id = %s", (pid,))
        profile = cur.fetchone()
        sleep_goal = profile['sleep_goal_minutes'] if profile and profile['sleep_goal_minutes'] else 450

        # Calculate stats
        # (the %s inside the quoted INTERVAL works because psycopg2 substitutes
        # parameters client-side, rendering the integer into the literal)
        cur.execute("""
            SELECT
                AVG(duration_minutes)::FLOAT as avg_duration,
                AVG(quality)::FLOAT as avg_quality,
                COUNT(*) as total_nights,
                COUNT(CASE WHEN duration_minutes < %s THEN 1 END) as nights_below_goal
            FROM sleep_log
            WHERE profile_id = %s
              AND date >= CURRENT_DATE - INTERVAL '%s days'
        """, (sleep_goal, pid, days))

        stats = cur.fetchone()

        return SleepStatsResponse(
            avg_duration_minutes=round(stats['avg_duration'], 1) if stats['avg_duration'] else 0,
            avg_quality=round(stats['avg_quality'], 1) if stats['avg_quality'] else None,
            total_nights=stats['total_nights'],
            nights_below_goal=stats['nights_below_goal'],
            sleep_goal_minutes=sleep_goal
        )


@router.get("/debt")
def get_sleep_debt(
    days: int = 14,
    session: dict = Depends(require_auth)
):
    """Calculate sleep debt over last N days."""
    pid = session['profile_id']

    with get_db() as conn:
        cur = get_cursor(conn)

        # Get sleep goal
        cur.execute("SELECT sleep_goal_minutes FROM profiles WHERE id = %s", (pid,))
        profile = cur.fetchone()
        sleep_goal = profile['sleep_goal_minutes'] if profile and profile['sleep_goal_minutes'] else 450

        # Calculate debt
        cur.execute("""
            SELECT
                SUM(%s - duration_minutes) as debt_minutes,
                COUNT(*) as nights_analyzed
            FROM sleep_log
            WHERE profile_id = %s
              AND date >= CURRENT_DATE - INTERVAL '%s days'
        """, (sleep_goal, pid, days))

        result = cur.fetchone()

        debt_min = int(result['debt_minutes']) if result['debt_minutes'] else 0
        nights = result['nights_analyzed'] if result['nights_analyzed'] else 0

        # Format debt
        if debt_min == 0:
            formatted = "0 – kein Defizit"
        elif debt_min > 0:
            formatted = f"+{format_duration(debt_min)}"
        else:
            formatted = f"−{format_duration(abs(debt_min))}"

        return SleepDebtResponse(
            sleep_debt_minutes=debt_min,
            sleep_debt_formatted=formatted,
            days_analyzed=nights,
            sleep_goal_minutes=sleep_goal
        )
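The debt query sums `goal - duration` per night, so short nights add debt and long nights pay it back. A standalone sketch of the same arithmetic and sign formatting (the function name and sample durations are illustrative):

```python
def format_duration(minutes: int) -> str:
    return f"{minutes // 60}h {minutes % 60}min"

def sleep_debt(durations: list[int], goal: int = 450) -> tuple[int, str]:
    """Positive debt means sleep was below goal overall."""
    debt = sum(goal - d for d in durations)
    if debt == 0:
        return 0, "0 – kein Defizit"
    sign = "+" if debt > 0 else "−"
    return debt, f"{sign}{format_duration(abs(debt))}"


# Three nights against a 450 min goal: +30, +20, -30 -> net 20 min of debt
debt, formatted = sleep_debt([420, 430, 480], goal=450)
print(debt, formatted)  # 20 +0h 20min
```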
@router.get("/trend")
def get_sleep_trend(
    days: int = 30,
    session: dict = Depends(require_auth)
):
    """Get sleep duration and quality trend over time."""
    pid = session['profile_id']

    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""
            SELECT
                date,
                duration_minutes,
                quality
            FROM sleep_log
            WHERE profile_id = %s
              AND date >= CURRENT_DATE - INTERVAL '%s days'
            ORDER BY date ASC
        """, (pid, days))
        rows = cur.fetchall()

        return [
            {
                "date": str(row['date']),
                "duration_minutes": row['duration_minutes'],
                "quality": row['quality']
            }
            for row in rows
        ]


@router.get("/phases")
def get_sleep_phases(
    days: int = 30,
    session: dict = Depends(require_auth)
):
    """Get sleep phase distribution (deep, REM, light, awake) over time."""
    pid = session['profile_id']

    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""
            SELECT
                date,
                deep_minutes,
                rem_minutes,
                light_minutes,
                awake_minutes,
                duration_minutes
            FROM sleep_log
            WHERE profile_id = %s
              AND date >= CURRENT_DATE - INTERVAL '%s days'
              AND (deep_minutes IS NOT NULL OR rem_minutes IS NOT NULL)
            ORDER BY date ASC
        """, (pid, days))
        rows = cur.fetchall()

        return [
            {
                "date": str(row['date']),
                "deep_minutes": row['deep_minutes'],
                "rem_minutes": row['rem_minutes'],
                "light_minutes": row['light_minutes'],
                "awake_minutes": row['awake_minutes'],
                "deep_percent": calculate_phase_percent(row['deep_minutes'], row['duration_minutes']),
                "rem_percent": calculate_phase_percent(row['rem_minutes'], row['duration_minutes'])
            }
            for row in rows
        ]


# ── Import Endpoints ──────────────────────────────────────────────────────────

@router.post("/import/apple-health")
async def import_apple_health_sleep(
    file: UploadFile = File(...),
    session: dict = Depends(require_auth)
):
    """
    Import sleep data from Apple Health CSV export.

    Expected CSV format:
    Start,End,Duration (hr),Value,Source
    2026-03-14 22:44:23,2026-03-14 23:00:19,0.266,Kern,Apple Watch

    - Aggregates segments by night (wake date)
    - Maps German phase names: Kern→light, REM→rem, Tief→deep, Wach→awake
    - Stores raw segments in JSONB
    - Does NOT overwrite manual entries (source='manual')
    """
    pid = session['profile_id']

    # Read CSV
    content = await file.read()
    csv_text = content.decode('utf-8-sig')  # Handle BOM
    reader = csv.DictReader(io.StringIO(csv_text))

    # Phase mapping (German → English)
    phase_map = {
        'Kern': 'light',
        'REM': 'rem',
        'Tief': 'deep',
        'Wach': 'awake',
        'Schlafend': None  # Ignore initial sleep entry
    }

    # Parse segments
    segments = []
    for row in reader:
        phase_de = row['Value'].strip()
        phase_en = phase_map.get(phase_de)

        if phase_en is None:  # Skip "Schlafend"
            continue

        start_dt = datetime.strptime(row['Start'], '%Y-%m-%d %H:%M:%S')
        end_dt = datetime.strptime(row['End'], '%Y-%m-%d %H:%M:%S')
        duration_hr = float(row['Duration (hr)'])
        duration_min = int(duration_hr * 60)

        segments.append({
            'start': start_dt,
            'end': end_dt,
            'duration_min': duration_min,
            'phase': phase_en
        })

    # Sort segments chronologically
    segments.sort(key=lambda s: s['start'])

    # Group segments into nights (gap-based):
    # if gap between segments > 2 hours → new night
    nights = []
    current_night = None

    for seg in segments:
        # Start new night if:
        # 1. First segment
        # 2. Gap > 2 hours since last segment
        if current_night is None or (seg['start'] - current_night['wake_time']).total_seconds() > 7200:
            current_night = {
                'bedtime': seg['start'],
                'wake_time': seg['end'],
                'segments': [],
                'deep_minutes': 0,
                'rem_minutes': 0,
                'light_minutes': 0,
                'awake_minutes': 0
            }
            nights.append(current_night)

        # Add segment to current night
        current_night['segments'].append(seg)
        current_night['wake_time'] = max(current_night['wake_time'], seg['end'])
        current_night['bedtime'] = min(current_night['bedtime'], seg['start'])
|
||||||
|
|
||||||
|
# Sum phases
|
||||||
|
if seg['phase'] == 'deep':
|
||||||
|
current_night['deep_minutes'] += seg['duration_min']
|
||||||
|
elif seg['phase'] == 'rem':
|
||||||
|
current_night['rem_minutes'] += seg['duration_min']
|
||||||
|
elif seg['phase'] == 'light':
|
||||||
|
current_night['light_minutes'] += seg['duration_min']
|
||||||
|
elif seg['phase'] == 'awake':
|
||||||
|
current_night['awake_minutes'] += seg['duration_min']
|
||||||
|
|
||||||
|
# Convert nights list to dict with wake_date as key
|
||||||
|
nights_dict = {}
|
||||||
|
for night in nights:
|
||||||
|
wake_date = night['wake_time'].date() # Date when you woke up
|
||||||
|
nights_dict[wake_date] = night
|
||||||
|
|
||||||
|
# Insert nights
|
||||||
|
imported = 0
|
||||||
|
skipped = 0
|
||||||
|
|
||||||
|
with get_db() as conn:
|
||||||
|
cur = get_cursor(conn)
|
||||||
|
|
||||||
|
for date, night in nights_dict.items():
|
||||||
|
# Calculate sleep duration (deep + rem + light, WITHOUT awake)
|
||||||
|
# Note: awake_minutes tracked separately, not part of sleep duration
|
||||||
|
duration_minutes = (
|
||||||
|
night['deep_minutes'] +
|
||||||
|
night['rem_minutes'] +
|
||||||
|
night['light_minutes']
|
||||||
|
)
|
||||||
|
|
||||||
|
# Calculate wake_count (number of awake segments)
|
||||||
|
wake_count = sum(1 for seg in night['segments'] if seg['phase'] == 'awake')
|
||||||
|
|
||||||
|
# Prepare JSONB segments with full datetime
|
||||||
|
sleep_segments = [
|
||||||
|
{
|
||||||
|
'phase': seg['phase'],
|
||||||
|
'start': seg['start'].isoformat(), # Full datetime: 2026-03-21T22:30:00
|
||||||
|
'end': seg['end'].isoformat(), # Full datetime: 2026-03-21T23:15:00
|
||||||
|
'duration_min': seg['duration_min']
|
||||||
|
}
|
||||||
|
for seg in night['segments']
|
||||||
|
]
|
||||||
|
|
||||||
|
# Check if manual entry exists - do NOT overwrite
|
||||||
|
cur.execute("""
|
||||||
|
SELECT id, source FROM sleep_log
|
||||||
|
WHERE profile_id = %s AND date = %s
|
||||||
|
""", (pid, date))
|
||||||
|
existing = cur.fetchone()
|
||||||
|
|
||||||
|
if existing and existing['source'] == 'manual':
|
||||||
|
skipped += 1
|
||||||
|
continue # Skip - don't overwrite manual entries
|
||||||
|
|
||||||
|
# Upsert (only if not manual)
|
||||||
|
# If entry exists and is NOT manual → update
|
||||||
|
# If entry doesn't exist → insert
|
||||||
|
if existing:
|
||||||
|
# Update existing non-manual entry
|
||||||
|
cur.execute("""
|
||||||
|
UPDATE sleep_log SET
|
||||||
|
bedtime = %s,
|
||||||
|
wake_time = %s,
|
||||||
|
duration_minutes = %s,
|
||||||
|
wake_count = %s,
|
||||||
|
deep_minutes = %s,
|
||||||
|
rem_minutes = %s,
|
||||||
|
light_minutes = %s,
|
||||||
|
awake_minutes = %s,
|
||||||
|
sleep_segments = %s,
|
||||||
|
source = 'apple_health',
|
||||||
|
updated_at = CURRENT_TIMESTAMP
|
||||||
|
WHERE id = %s AND profile_id = %s
|
||||||
|
""", (
|
||||||
|
night['bedtime'].time(),
|
||||||
|
night['wake_time'].time(),
|
||||||
|
duration_minutes,
|
||||||
|
wake_count,
|
||||||
|
night['deep_minutes'],
|
||||||
|
night['rem_minutes'],
|
||||||
|
night['light_minutes'],
|
||||||
|
night['awake_minutes'],
|
||||||
|
json.dumps(sleep_segments),
|
||||||
|
existing['id'],
|
||||||
|
pid
|
||||||
|
))
|
||||||
|
else:
|
||||||
|
# Insert new entry
|
||||||
|
cur.execute("""
|
||||||
|
INSERT INTO sleep_log (
|
||||||
|
profile_id, date, bedtime, wake_time, duration_minutes,
|
||||||
|
wake_count, deep_minutes, rem_minutes, light_minutes, awake_minutes,
|
||||||
|
sleep_segments, source, created_at, updated_at
|
||||||
|
) VALUES (
|
||||||
|
%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, 'apple_health', CURRENT_TIMESTAMP, CURRENT_TIMESTAMP
|
||||||
|
)
|
||||||
|
""", (
|
||||||
|
pid,
|
||||||
|
date,
|
||||||
|
night['bedtime'].time(),
|
||||||
|
night['wake_time'].time(),
|
||||||
|
duration_minutes,
|
||||||
|
wake_count,
|
||||||
|
night['deep_minutes'],
|
||||||
|
night['rem_minutes'],
|
||||||
|
night['light_minutes'],
|
||||||
|
night['awake_minutes'],
|
||||||
|
json.dumps(sleep_segments)
|
||||||
|
))
|
||||||
|
|
||||||
|
imported += 1
|
||||||
|
|
||||||
|
conn.commit()
|
||||||
|
|
||||||
|
return {
|
||||||
|
"imported": imported,
|
||||||
|
"skipped": skipped,
|
||||||
|
"total_nights": len(nights_dict),
|
||||||
|
"message": f"{imported} Nächte importiert, {skipped} übersprungen (manuelle Einträge)"
|
||||||
|
}
|
||||||
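The gap-based night grouping above is pure Python and easy to exercise in isolation. A minimal sketch of the same loop (the function name `group_into_nights` and the `GAP_SECONDS` constant are illustrative, not part of the router):

```python
from datetime import datetime

GAP_SECONDS = 2 * 3600  # a gap of more than 2 hours starts a new night

def group_into_nights(segments):
    """Group sleep segments (dicts with 'start'/'end' datetimes) into nights."""
    nights = []
    current = None
    for seg in sorted(segments, key=lambda s: s['start']):
        if current is None or (seg['start'] - current['wake_time']).total_seconds() > GAP_SECONDS:
            current = {'bedtime': seg['start'], 'wake_time': seg['end'], 'segments': []}
            nights.append(current)
        current['segments'].append(seg)
        current['wake_time'] = max(current['wake_time'], seg['end'])
        current['bedtime'] = min(current['bedtime'], seg['start'])
    return nights

# 22:00-23:00 and 23:30-01:00 are one night (30 min gap);
# 09:00-10:00 the next morning is 8 h later, so it opens a second night.
segs = [
    {'start': datetime(2026, 3, 21, 22), 'end': datetime(2026, 3, 21, 23)},
    {'start': datetime(2026, 3, 21, 23, 30), 'end': datetime(2026, 3, 22, 1)},
    {'start': datetime(2026, 3, 22, 9), 'end': datetime(2026, 3, 22, 10)},
]
nights = group_into_nights(segs)
```

Keying nights by `wake_time.date()` afterwards, as the endpoint does, ensures a night that straddles midnight is filed under the morning the sleeper woke up.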
backend/routers/stats.py (new file, 39 lines)
@@ -0,0 +1,39 @@
"""
Statistics Endpoints for Mitai Jinkendo

Dashboard statistics showing entry counts across all categories.
"""
from typing import Optional

from fastapi import APIRouter, Header, Depends

from db import get_db, get_cursor
from auth import require_auth
from routers.profiles import get_pid

router = APIRouter(prefix="/api", tags=["stats"])


@router.get("/stats")
def get_stats(x_profile_id: Optional[str] = Header(default=None), session: dict = Depends(require_auth)):
    """Get entry counts for all tracking categories."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("SELECT COUNT(*) as count FROM weight_log WHERE profile_id = %s", (pid,))
        weight_count = cur.fetchone()['count']
        cur.execute("SELECT COUNT(*) as count FROM circumference_log WHERE profile_id = %s", (pid,))
        circ_count = cur.fetchone()['count']
        cur.execute("SELECT COUNT(*) as count FROM caliper_log WHERE profile_id = %s", (pid,))
        caliper_count = cur.fetchone()['count']
        cur.execute("SELECT COUNT(*) as count FROM nutrition_log WHERE profile_id = %s", (pid,))
        nutrition_count = cur.fetchone()['count']
        cur.execute("SELECT COUNT(*) as count FROM activity_log WHERE profile_id = %s", (pid,))
        activity_count = cur.fetchone()['count']
        return {
            "weight_count": weight_count,
            "circ_count": circ_count,
            "caliper_count": caliper_count,
            "nutrition_count": nutrition_count,
            "activity_count": activity_count
        }
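The endpoint above issues five separate `COUNT(*)` round trips. Under the same schema they could be collapsed into one statement with scalar subselects; a hypothetical sketch (the query-builder helper below is not part of the codebase):

```python
# Tables counted by the /stats endpoint, each filtered by profile_id.
TABLES = ["weight_log", "circumference_log", "caliper_log", "nutrition_log", "activity_log"]

def build_stats_query(tables):
    """Build one SELECT that returns a <table>_count column per table."""
    selects = ",\n    ".join(
        f"(SELECT COUNT(*) FROM {t} WHERE profile_id = %s) AS {t}_count"
        for t in tables
    )
    # Caller must supply one pid parameter per subselect, in table order.
    return f"SELECT\n    {selects}", len(tables)

sql, n_params = build_stats_query(TABLES)
```

Whether the single round trip is worth the less readable SQL depends on latency to the database; for five cheap counts either form is defensible.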
backend/routers/subscription.py (new file, 187 lines)
@@ -0,0 +1,187 @@
"""
User Subscription Endpoints for Mitai Jinkendo

User-facing subscription info (own tier, usage, limits).
"""
from datetime import datetime
from fastapi import APIRouter, Depends

from db import get_db, get_cursor, r2d
from auth import require_auth, get_effective_tier, check_feature_access

router = APIRouter(prefix="/api/subscription", tags=["subscription"])


@router.get("/me")
def get_my_subscription(session: dict = Depends(require_auth)):
    """
    Get current user's subscription info.

    Returns:
    - tier: Current effective tier (considers access_grants)
    - profile_tier: Base tier from profile
    - trial_ends_at: Trial expiration (if applicable)
    - email_verified: Email verification status
    - active_grants: List of active access grants (coupons, trials)
    """
    profile_id = session['profile_id']

    with get_db() as conn:
        cur = get_cursor(conn)

        # Get profile info
        cur.execute("""
            SELECT tier, trial_ends_at, email_verified
            FROM profiles
            WHERE id = %s
        """, (profile_id,))
        profile = cur.fetchone()

        if not profile:
            return {"error": "Profile not found"}

        # Get effective tier (considers access_grants)
        effective_tier = get_effective_tier(profile_id)

        # Get active access grants
        cur.execute("""
            SELECT
                ag.id,
                ag.tier_id,
                ag.granted_by,
                ag.valid_from,
                ag.valid_until,
                ag.is_active,
                ag.paused_by,
                ag.remaining_days,
                t.name as tier_name
            FROM access_grants ag
            JOIN tiers t ON t.id = ag.tier_id
            WHERE ag.profile_id = %s
              AND ag.valid_until > CURRENT_TIMESTAMP
            ORDER BY ag.valid_until DESC
        """, (profile_id,))
        grants = [r2d(r) for r in cur.fetchall()]

        # Get tier info
        cur.execute("""
            SELECT id, name, description, price_monthly_cents, price_yearly_cents
            FROM tiers
            WHERE id = %s
        """, (effective_tier,))
        tier_info = r2d(cur.fetchone())

        return {
            "tier": effective_tier,
            "tier_info": tier_info,
            "profile_tier": profile['tier'],
            "trial_ends_at": profile['trial_ends_at'].isoformat() if profile['trial_ends_at'] else None,
            "email_verified": profile['email_verified'],
            "active_grants": grants
        }


@router.get("/usage")
def get_my_usage(session: dict = Depends(require_auth)):
    """
    Get current user's feature usage.

    Returns list of features with current usage and limits.
    """
    profile_id = session['profile_id']

    with get_db() as conn:
        cur = get_cursor(conn)

        # Get all active features
        cur.execute("""
            SELECT id, name, category, limit_type, reset_period
            FROM features
            WHERE active = true
            ORDER BY category, name
        """)
        features = [r2d(r) for r in cur.fetchall()]

        # Get usage for each feature
        usage_list = []
        for feature in features:
            access = check_feature_access(profile_id, feature['id'])
            usage_list.append({
                "feature_id": feature['id'],
                "feature_name": feature['name'],
                "category": feature['category'],
                "limit_type": feature['limit_type'],
                "reset_period": feature['reset_period'],
                "allowed": access['allowed'],
                "limit": access['limit'],
                "used": access['used'],
                "remaining": access['remaining'],
                "reason": access['reason']
            })

        return {
            "tier": get_effective_tier(profile_id),
            "features": usage_list
        }


@router.get("/limits")
def get_my_limits(session: dict = Depends(require_auth)):
    """
    Get all feature limits for current tier.

    Simplified view - just shows what's allowed/not allowed.
    """
    profile_id = session['profile_id']
    tier_id = get_effective_tier(profile_id)

    with get_db() as conn:
        cur = get_cursor(conn)

        # Get all features with their limits for this tier
        cur.execute("""
            SELECT
                f.id,
                f.name,
                f.category,
                f.limit_type,
                COALESCE(tl.limit_value, f.default_limit) as limit_value
            FROM features f
            LEFT JOIN tier_limits tl ON tl.feature_id = f.id AND tl.tier_id = %s
            WHERE f.active = true
            ORDER BY f.category, f.name
        """, (tier_id,))

        features = []
        for row in cur.fetchall():
            rd = r2d(row)
            limit = rd['limit_value']

            # Interpret limit
            if limit is None:
                status = "unlimited"
            elif limit == 0:
                status = "disabled"
            elif rd['limit_type'] == 'boolean':
                status = "enabled" if limit == 1 else "disabled"
            else:
                status = f"limit: {limit}"

            features.append({
                "feature_id": rd['id'],
                "feature_name": rd['name'],
                "category": rd['category'],
                "limit": limit,
                "status": status
            })

        # Get tier info
        cur.execute("SELECT name, description FROM tiers WHERE id = %s", (tier_id,))
        tier = cur.fetchone()

        return {
            "tier_id": tier_id,
            "tier_name": tier['name'] if tier else tier_id,
            "tier_description": tier['description'] if tier else '',
            "features": features
        }
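The limit-interpretation branch in `/limits` is pure and worth extracting for testing; a sketch of the same rules as a standalone helper (the function name is illustrative, not part of the router):

```python
def interpret_limit(limit_value, limit_type):
    """Mirror the /limits interpretation:
    NULL (None) = unlimited, 0 = disabled,
    boolean features are on only at exactly 1,
    any other number is a hard cap."""
    if limit_value is None:
        return "unlimited"
    if limit_value == 0:
        return "disabled"
    if limit_type == 'boolean':
        return "enabled" if limit_value == 1 else "disabled"
    return f"limit: {limit_value}"
```

Note the ordering matters: the `limit_value == 0` check runs before the boolean check, so a boolean feature with value 0 is reported "disabled" by either branch, which keeps the two paths consistent.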
backend/routers/tier_limits.py (new file, 158 lines)
@@ -0,0 +1,158 @@
"""
Tier Limits Management Endpoints for Mitai Jinkendo

Admin-only matrix editor for Tier x Feature limits.
"""
from fastapi import APIRouter, HTTPException, Depends

from db import get_db, get_cursor, r2d
from auth import require_admin

router = APIRouter(prefix="/api/tier-limits", tags=["tier-limits"])


@router.get("")
def get_tier_limits_matrix(session: dict = Depends(require_admin)):
    """
    Admin: Get complete Tier x Feature matrix.

    Returns:
    {
        "tiers": [{id, name}, ...],
        "features": [{id, name, category}, ...],
        "limits": {
            "tier_id:feature_id": limit_value,
            ...
        }
    }
    """
    with get_db() as conn:
        cur = get_cursor(conn)

        # Get all tiers (including inactive - admin needs to configure all)
        cur.execute("SELECT id, name, sort_order FROM tiers ORDER BY sort_order")
        tiers = [r2d(r) for r in cur.fetchall()]

        # Get all features
        cur.execute("""
            SELECT id, name, category, limit_type, default_limit, reset_period
            FROM features
            WHERE active = true
            ORDER BY category, name
        """)
        features = [r2d(r) for r in cur.fetchall()]

        # Get all tier_limits
        cur.execute("SELECT tier_id, feature_id, limit_value FROM tier_limits")
        limits = {}
        for row in cur.fetchall():
            key = f"{row['tier_id']}:{row['feature_id']}"
            limits[key] = row['limit_value']

        return {
            "tiers": tiers,
            "features": features,
            "limits": limits
        }


@router.put("")
def update_tier_limit(data: dict, session: dict = Depends(require_admin)):
    """
    Admin: Update single tier limit.

    Body:
    {
        "tier_id": "free",
        "feature_id": "weight_entries",
        "limit_value": 30  // NULL = unlimited, 0 = disabled
    }
    """
    tier_id = data.get('tier_id')
    feature_id = data.get('feature_id')
    limit_value = data.get('limit_value')  # Can be None (NULL)

    if not tier_id or not feature_id:
        raise HTTPException(400, "tier_id und feature_id fehlen")

    with get_db() as conn:
        cur = get_cursor(conn)

        # Upsert tier_limit
        cur.execute("""
            INSERT INTO tier_limits (tier_id, feature_id, limit_value)
            VALUES (%s, %s, %s)
            ON CONFLICT (tier_id, feature_id)
            DO UPDATE SET
                limit_value = EXCLUDED.limit_value,
                updated = CURRENT_TIMESTAMP
        """, (tier_id, feature_id, limit_value))

        conn.commit()

    return {"ok": True}


@router.put("/batch")
def update_tier_limits_batch(data: dict, session: dict = Depends(require_admin)):
    """
    Admin: Batch update multiple tier limits.

    Body:
    {
        "updates": [
            {"tier_id": "free", "feature_id": "weight_entries", "limit_value": 30},
            {"tier_id": "free", "feature_id": "ai_calls", "limit_value": 0},
            ...
        ]
    }
    """
    updates = data.get('updates', [])

    if not updates:
        raise HTTPException(400, "updates array fehlt")

    with get_db() as conn:
        cur = get_cursor(conn)

        for update in updates:
            tier_id = update.get('tier_id')
            feature_id = update.get('feature_id')
            limit_value = update.get('limit_value')

            if not tier_id or not feature_id:
                continue  # Skip invalid entries

            cur.execute("""
                INSERT INTO tier_limits (tier_id, feature_id, limit_value)
                VALUES (%s, %s, %s)
                ON CONFLICT (tier_id, feature_id)
                DO UPDATE SET
                    limit_value = EXCLUDED.limit_value,
                    updated = CURRENT_TIMESTAMP
            """, (tier_id, feature_id, limit_value))

        conn.commit()

    return {"ok": True, "updated": len(updates)}


@router.delete("")
def delete_tier_limit(tier_id: str, feature_id: str, session: dict = Depends(require_admin)):
    """
    Admin: Delete tier limit (falls back to feature default).

    Query params: ?tier_id=...&feature_id=...
    """
    if not tier_id or not feature_id:
        raise HTTPException(400, "tier_id und feature_id fehlen")

    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""
            DELETE FROM tier_limits
            WHERE tier_id = %s AND feature_id = %s
        """, (tier_id, feature_id))
        conn.commit()

    return {"ok": True}
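The matrix endpoint keys limits by `"tier_id:feature_id"`, and deleting a row falls back to the feature default, mirroring the `COALESCE(tl.limit_value, f.default_limit)` on the user-facing side. Resolving one cell of that matrix can be sketched as (the helper name is illustrative):

```python
def effective_limit(limits, tier_id, feature_id, default_limit=None):
    """Resolve one cell of the tier x feature matrix: an explicit
    tier_limits entry wins; otherwise the feature default applies
    (None still means unlimited at either level)."""
    return limits.get(f"{tier_id}:{feature_id}", default_limit)

# Shape matches the "limits" dict returned by GET /api/tier-limits.
limits = {"free:weight_entries": 30, "free:ai_calls": 0}
```

Because `dict.get` only falls back when the key is absent, an explicit `0` ("disabled") is preserved and never confused with a missing row.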
backend/routers/tiers_mgmt.py (new file, 117 lines)
@@ -0,0 +1,117 @@
"""
Tier Management Endpoints for Mitai Jinkendo

Admin-only CRUD for subscription tiers.
"""
from fastapi import APIRouter, HTTPException, Depends

from db import get_db, get_cursor, r2d
from auth import require_admin

router = APIRouter(prefix="/api/tiers", tags=["tiers"])


@router.get("")
def list_tiers(session: dict = Depends(require_admin)):
    """Admin: List all tiers."""
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""
            SELECT * FROM tiers
            ORDER BY sort_order
        """)
        return [r2d(r) for r in cur.fetchall()]


@router.post("")
def create_tier(data: dict, session: dict = Depends(require_admin)):
    """
    Admin: Create new tier.

    Required fields:
    - id: Tier ID (e.g., 'enterprise')
    - name: Display name
    - price_monthly_cents, price_yearly_cents: Prices (NULL for free tiers)
    """
    tier_id = data.get('id', '').strip()
    name = data.get('name', '').strip()
    description = data.get('description', '')
    price_monthly_cents = data.get('price_monthly_cents')
    price_yearly_cents = data.get('price_yearly_cents')
    sort_order = data.get('sort_order', 99)

    if not tier_id or not name:
        raise HTTPException(400, "ID und Name fehlen")

    with get_db() as conn:
        cur = get_cursor(conn)

        # Check if ID already exists
        cur.execute("SELECT id FROM tiers WHERE id = %s", (tier_id,))
        if cur.fetchone():
            raise HTTPException(400, f"Tier '{tier_id}' existiert bereits")

        # Create tier
        cur.execute("""
            INSERT INTO tiers (
                id, name, description, price_monthly_cents, price_yearly_cents, sort_order
            )
            VALUES (%s, %s, %s, %s, %s, %s)
        """, (tier_id, name, description, price_monthly_cents, price_yearly_cents, sort_order))

        conn.commit()

    return {"ok": True, "id": tier_id}


@router.put("/{tier_id}")
def update_tier(tier_id: str, data: dict, session: dict = Depends(require_admin)):
    """Admin: Update tier."""
    with get_db() as conn:
        cur = get_cursor(conn)

        updates = []
        values = []

        if 'name' in data:
            updates.append('name = %s')
            values.append(data['name'])
        if 'description' in data:
            updates.append('description = %s')
            values.append(data['description'])
        if 'price_monthly_cents' in data:
            updates.append('price_monthly_cents = %s')
            values.append(data['price_monthly_cents'])
        if 'price_yearly_cents' in data:
            updates.append('price_yearly_cents = %s')
            values.append(data['price_yearly_cents'])
        if 'active' in data:
            updates.append('active = %s')
            values.append(data['active'])
        if 'sort_order' in data:
            updates.append('sort_order = %s')
            values.append(data['sort_order'])

        if not updates:
            return {"ok": True}

        updates.append('updated = CURRENT_TIMESTAMP')
        values.append(tier_id)

        cur.execute(
            f"UPDATE tiers SET {', '.join(updates)} WHERE id = %s",
            values
        )
        conn.commit()

    return {"ok": True}


@router.delete("/{tier_id}")
def delete_tier(tier_id: str, session: dict = Depends(require_admin)):
    """Admin: Delete tier (soft-delete: set active=false)."""
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("UPDATE tiers SET active = false WHERE id = %s", (tier_id,))
        conn.commit()
    return {"ok": True}
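`update_tier` builds its SET clause dynamically, but only from a fixed series of `if 'col' in data` checks, so no caller-controlled column name ever reaches the f-string; only values travel through `%s` placeholders. The same allowlist pattern as a standalone sketch (function and constant names are illustrative):

```python
ALLOWED_COLUMNS = (
    'name', 'description', 'price_monthly_cents',
    'price_yearly_cents', 'active', 'sort_order',
)

def build_tier_update(data):
    """Return (sql, values) for an allowlisted partial UPDATE,
    or (None, []) when nothing updatable was supplied.
    Caller appends the row id to values for the WHERE clause."""
    sets, values = [], []
    for col in ALLOWED_COLUMNS:  # never iterate over data.keys() here
        if col in data:
            sets.append(f"{col} = %s")
            values.append(data[col])
    if not sets:
        return None, []
    sets.append("updated = CURRENT_TIMESTAMP")
    return f"UPDATE tiers SET {', '.join(sets)} WHERE id = %s", values

# Unknown keys are silently dropped instead of reaching the SQL string.
sql, values = build_tier_update({'name': 'Pro', 'evil; DROP TABLE tiers': 1})
```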
backend/routers/training_types.py (new file, 129 lines)
@@ -0,0 +1,129 @@
"""
Training Types API - v9d

Provides hierarchical list of training categories and subcategories
for activity classification.
"""
from fastapi import APIRouter, Depends
from db import get_db, get_cursor
from auth import require_auth

router = APIRouter(prefix="/api/training-types", tags=["training-types"])


@router.get("")
def list_training_types(session: dict = Depends(require_auth)):
    """
    Get all training types, grouped by category.

    Returns hierarchical structure:
    {
        "cardio": [
            {"id": 1, "subcategory": "running", "name_de": "Laufen", ...},
            ...
        ],
        "strength": [...],
        ...
    }
    """
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""
            SELECT id, category, subcategory, name_de, name_en, icon, sort_order
            FROM training_types
            ORDER BY sort_order, category, subcategory
        """)
        rows = cur.fetchall()

        # Group by category
        grouped = {}
        for row in rows:
            cat = row['category']
            if cat not in grouped:
                grouped[cat] = []
            grouped[cat].append({
                'id': row['id'],
                'category': row['category'],
                'subcategory': row['subcategory'],
                'name_de': row['name_de'],
                'name_en': row['name_en'],
                'icon': row['icon'],
                'sort_order': row['sort_order']
            })

        return grouped


@router.get("/flat")
def list_training_types_flat(session: dict = Depends(require_auth)):
    """
    Get all training types as a flat list (for a simple dropdown).
    """
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""
            SELECT id, category, subcategory, name_de, name_en, icon
            FROM training_types
            ORDER BY sort_order
        """)
        rows = cur.fetchall()

        return [dict(row) for row in rows]


@router.get("/categories")
def list_categories(session: dict = Depends(require_auth)):
    """
    Get list of unique categories with metadata.
    """
    categories = {
        'cardio': {
            'name_de': 'Cardio (Ausdauer)',
            'name_en': 'Cardio (Endurance)',
            'icon': '❤️',
            'color': '#EF4444'
        },
        'strength': {
            'name_de': 'Kraft',
            'name_en': 'Strength',
            'icon': '💪',
            'color': '#3B82F6'
        },
        'hiit': {
            'name_de': 'Schnellkraft / HIIT',
            'name_en': 'Power / HIIT',
            'icon': '🔥',
            'color': '#F59E0B'
        },
        'martial_arts': {
            'name_de': 'Kampfsport',
            'name_en': 'Martial Arts',
            'icon': '🥋',
            'color': '#8B5CF6'
        },
        'mobility': {
            'name_de': 'Mobility & Dehnung',
            'name_en': 'Mobility & Stretching',
            'icon': '🧘',
            'color': '#10B981'
        },
        'recovery': {
            'name_de': 'Erholung (aktiv)',
            'name_en': 'Recovery (active)',
            'icon': '💆',
            'color': '#6B7280'
        },
        'mind': {
            'name_de': 'Geist & Meditation',
            'name_en': 'Mind & Meditation',
            'icon': '🧘‍♂️',
            'color': '#A78BFA'
        },
        'other': {
            'name_de': 'Sonstiges',
            'name_en': 'Other',
            'icon': '📝',
            'color': '#9CA3AF'
        }
    }
    return categories
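The manual `if cat not in grouped` bookkeeping in `list_training_types` can also be written with `collections.defaultdict`; a behavior-equivalent sketch (helper name is illustrative):

```python
from collections import defaultdict

def group_by_category(rows):
    """Group training-type rows into {category: [row, ...]},
    preserving the rows' existing sort order within each category."""
    grouped = defaultdict(list)
    for row in rows:
        grouped[row['category']].append(row)
    return dict(grouped)  # return a plain dict, like the original endpoint

rows = [
    {'id': 1, 'category': 'cardio', 'subcategory': 'running'},
    {'id': 2, 'category': 'strength', 'subcategory': 'barbell'},
    {'id': 3, 'category': 'cardio', 'subcategory': 'cycling'},
]
grouped = group_by_category(rows)
```

Converting back to a plain `dict` before returning matters here: a bare `defaultdict` would silently create empty lists on lookup of unknown categories downstream.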
backend/routers/user_restrictions.py (new file, 140 lines)
@@ -0,0 +1,140 @@
|
||||||
|
"""
|
||||||
|
User Restrictions Management Endpoints for Mitai Jinkendo
|
||||||
|
|
||||||
|
Admin-only user-specific feature overrides.
|
||||||
|
"""
|
||||||
|
from fastapi import APIRouter, HTTPException, Depends
|
||||||
|
|
||||||
|
from db import get_db, get_cursor, r2d
|
||||||
|
from auth import require_admin
|
||||||
|
|
||||||
|
router = APIRouter(prefix="/api/user-restrictions", tags=["user-restrictions"])
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("")
|
||||||
|
def list_user_restrictions(profile_id: str = None, session: dict = Depends(require_admin)):
|
||||||
|
"""
|
||||||
|
Admin: List user restrictions.
|
||||||
|
|
||||||
|
Optional query param: ?profile_id=... (filter by user)
|
||||||
|
"""
|
||||||
|
with get_db() as conn:
|
||||||
|
cur = get_cursor(conn)
|
||||||
|
|
||||||
|
if profile_id:
|
||||||
|
cur.execute("""
|
||||||
|
SELECT
|
||||||
|
ur.*,
|
||||||
|
f.name as feature_name,
|
||||||
|
f.category as feature_category,
|
||||||
|
p.name as profile_name
|
||||||
|
FROM user_feature_restrictions ur
|
||||||
|
JOIN features f ON f.id = ur.feature_id
|
||||||
|
JOIN profiles p ON p.id = ur.profile_id
|
||||||
|
WHERE ur.profile_id = %s
|
||||||
|
ORDER BY f.category, f.name
|
||||||
|
""", (profile_id,))
|
||||||
|
else:
|
||||||
|
cur.execute("""
|
||||||
|
SELECT
|
||||||
|
ur.*,
|
||||||
|
f.name as feature_name,
|
||||||
|
f.category as feature_category,
|
||||||
|
p.name as profile_name,
|
||||||
|
p.email as profile_email
|
||||||
|
FROM user_feature_restrictions ur
|
||||||
|
JOIN features f ON f.id = ur.feature_id
|
||||||
|
JOIN profiles p ON p.id = ur.profile_id
|
||||||
|
ORDER BY p.name, f.category, f.name
|
||||||
|
""")
|
||||||
|
|
||||||
|
return [r2d(r) for r in cur.fetchall()]
|
||||||
|
|
||||||
|
|
||||||
|
@router.post("")
def create_user_restriction(data: dict, session: dict = Depends(require_admin)):
    """
    Admin: Create a user-specific feature restriction.

    Body:
        {
            "profile_id": "uuid",
            "feature_id": "weight_entries",
            "limit_value": 10,   // NULL = unlimited, 0 = disabled
            "reason": "Spam prevention"
        }
    """
    profile_id = data.get('profile_id')
    feature_id = data.get('feature_id')
    limit_value = data.get('limit_value')
    reason = data.get('reason', '')

    if not profile_id or not feature_id:
        raise HTTPException(400, "profile_id und feature_id fehlen")

    with get_db() as conn:
        cur = get_cursor(conn)

        # Check if restriction already exists
        cur.execute("""
            SELECT id FROM user_feature_restrictions
            WHERE profile_id = %s AND feature_id = %s
        """, (profile_id, feature_id))

        if cur.fetchone():
            raise HTTPException(400, "Einschränkung existiert bereits (nutze PUT zum Aktualisieren)")

        # Create restriction
        cur.execute("""
            INSERT INTO user_feature_restrictions (
                profile_id, feature_id, limit_value, reason, created_by
            )
            VALUES (%s, %s, %s, %s, %s)
            RETURNING id
        """, (profile_id, feature_id, limit_value, reason, session['profile_id']))

        restriction_id = cur.fetchone()['id']
        conn.commit()

    return {"ok": True, "id": restriction_id}

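The handler accepts the documented body loosely (a plain `dict`). A minimal standalone sketch of the validation rules implied by the docstring (the helper name and tuple shape are hypothetical, not part of the router):

```python
def validate_restriction_payload(data: dict):
    """Check a create-restriction body: profile_id and feature_id are
    required; limit_value may be None (unlimited) or a non-negative int
    (0 = disabled). Returns (ok, error_message)."""
    if not data.get('profile_id') or not data.get('feature_id'):
        return False, "profile_id and feature_id are required"
    limit = data.get('limit_value')
    if limit is not None and (not isinstance(limit, int) or limit < 0):
        return False, "limit_value must be a non-negative integer or None"
    return True, ""
```

Centralizing these checks would let the endpoint return one consistent 400 message instead of failing later at the database layer.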
@router.put("/{restriction_id}")
def update_user_restriction(restriction_id: str, data: dict, session: dict = Depends(require_admin)):
    """Admin: Update a user restriction."""
    with get_db() as conn:
        cur = get_cursor(conn)

        updates = []
        values = []

        if 'limit_value' in data:
            updates.append('limit_value = %s')
            values.append(data['limit_value'])
        if 'reason' in data:
            updates.append('reason = %s')
            values.append(data['reason'])

        if not updates:
            return {"ok": True}

        updates.append('updated = CURRENT_TIMESTAMP')
        values.append(restriction_id)

        cur.execute(
            f"UPDATE user_feature_restrictions SET {', '.join(updates)} WHERE id = %s",
            values
        )
        conn.commit()

    return {"ok": True}

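The handler above builds its `SET` clause from a hard-coded whitelist of keys, so only known column names ever reach the f-string while values travel as placeholders. That pattern can be factored into a small helper; a sketch (helper name hypothetical):

```python
def build_update(table: str, allowed: list, data: dict, row_id: str):
    """Build an UPDATE statement from whitelisted keys present in data.

    Only column names from `allowed` are interpolated into the SQL text;
    all values are bound via %s placeholders, as in the handler above.
    Returns (sql, values), or (None, []) when nothing is to be updated.
    """
    updates, values = [], []
    for col in allowed:
        if col in data:
            updates.append(f"{col} = %s")
            values.append(data[col])
    if not updates:
        return None, []
    values.append(row_id)
    return f"UPDATE {table} SET {', '.join(updates)} WHERE id = %s", values
```

The whitelist is what makes the f-string safe: user input selects among fixed column names but never contributes SQL text.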
@router.delete("/{restriction_id}")
def delete_user_restriction(restriction_id: str, session: dict = Depends(require_admin)):
    """Admin: Delete a user restriction (reverts to the tier limit)."""
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("DELETE FROM user_feature_restrictions WHERE id = %s", (restriction_id,))
        conn.commit()
    return {"ok": True}
684 backend/routers/vitals.py (new file)
@@ -0,0 +1,684 @@
"""
Vitals Router - Resting HR + HRV Tracking
v9d Phase 2: Vitals Module

Endpoints:
- GET    /api/vitals                      List vitals (with limit)
- GET    /api/vitals/by-date/{date}       Get vitals for a specific date
- POST   /api/vitals                      Create/update vitals (upsert)
- PUT    /api/vitals/{id}                 Update vitals
- DELETE /api/vitals/{id}                 Delete vitals
- GET    /api/vitals/stats                Get vitals statistics
- POST   /api/vitals/import/omron         Import Omron CSV
- POST   /api/vitals/import/apple-health  Import Apple Health CSV
"""
from fastapi import APIRouter, HTTPException, Depends, Header, UploadFile, File
from pydantic import BaseModel
from typing import Optional
from datetime import datetime, timedelta
import logging
import csv
import io
from dateutil import parser as date_parser

from db import get_db, get_cursor, r2d
from auth import require_auth

router = APIRouter(prefix="/api/vitals", tags=["vitals"])
logger = logging.getLogger(__name__)

# German month mapping for Omron dates
GERMAN_MONTHS = {
    'Januar': '01', 'Jan.': '01',
    'Februar': '02', 'Feb.': '02',
    'März': '03',
    'April': '04', 'Apr.': '04',
    'Mai': '05',
    'Juni': '06',
    'Juli': '07',
    'August': '08', 'Aug.': '08',
    'September': '09', 'Sep.': '09',
    'Oktober': '10', 'Okt.': '10',
    'November': '11', 'Nov.': '11',
    'Dezember': '12', 'Dez.': '12',
}


class VitalsEntry(BaseModel):
    date: str
    resting_hr: Optional[int] = None
    hrv: Optional[int] = None
    blood_pressure_systolic: Optional[int] = None
    blood_pressure_diastolic: Optional[int] = None
    pulse: Optional[int] = None
    vo2_max: Optional[float] = None
    spo2: Optional[int] = None
    respiratory_rate: Optional[float] = None
    irregular_heartbeat: Optional[bool] = None
    possible_afib: Optional[bool] = None
    note: Optional[str] = None


class VitalsUpdate(BaseModel):
    date: Optional[str] = None
    resting_hr: Optional[int] = None
    hrv: Optional[int] = None
    blood_pressure_systolic: Optional[int] = None
    blood_pressure_diastolic: Optional[int] = None
    pulse: Optional[int] = None
    vo2_max: Optional[float] = None
    spo2: Optional[int] = None
    respiratory_rate: Optional[float] = None
    irregular_heartbeat: Optional[bool] = None
    possible_afib: Optional[bool] = None
    note: Optional[str] = None


def get_pid(x_profile_id: Optional[str], session: dict) -> str:
    """Extract profile_id from the session (never from the header, for security)."""
    return session['profile_id']


@router.get("")
def list_vitals(
    limit: int = 90,
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """Get vitals entries for the current profile."""
    pid = get_pid(x_profile_id, session)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute(
            """
            SELECT id, profile_id, date, resting_hr, hrv,
                   blood_pressure_systolic, blood_pressure_diastolic, pulse,
                   vo2_max, spo2, respiratory_rate,
                   irregular_heartbeat, possible_afib,
                   note, source, created_at, updated_at
            FROM vitals_log
            WHERE profile_id = %s
            ORDER BY date DESC
            LIMIT %s
            """,
            (pid, limit)
        )
        return [r2d(r) for r in cur.fetchall()]


@router.get("/by-date/{date}")
def get_vitals_by_date(
    date: str,
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """Get the vitals entry for a specific date."""
    pid = get_pid(x_profile_id, session)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute(
            """
            SELECT id, profile_id, date, resting_hr, hrv,
                   blood_pressure_systolic, blood_pressure_diastolic, pulse,
                   vo2_max, spo2, respiratory_rate,
                   irregular_heartbeat, possible_afib,
                   note, source, created_at, updated_at
            FROM vitals_log
            WHERE profile_id = %s AND date = %s
            """,
            (pid, date)
        )
        row = cur.fetchone()
        if not row:
            raise HTTPException(404, "Keine Vitalwerte für dieses Datum gefunden")
        return r2d(row)


@router.post("")
def create_vitals(
    entry: VitalsEntry,
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """
    Create or update a vitals entry (upsert).

    Post-Migration-015: routes to vitals_baseline (for RHR, HRV, etc.).
    Note: BP measurements should use the /api/blood-pressure endpoint instead.
    """
    pid = get_pid(x_profile_id, session)

    # Validation: at least one baseline vital must be provided
    has_baseline = any([
        entry.resting_hr, entry.hrv, entry.vo2_max,
        entry.spo2, entry.respiratory_rate
    ])

    if not has_baseline:
        raise HTTPException(400, "Mindestens ein Vitalwert muss angegeben werden (RHR, HRV, VO2Max, SpO2, oder Atemfrequenz)")

    with get_db() as conn:
        cur = get_cursor(conn)

        # Upsert into vitals_baseline (Migration 015)
        cur.execute(
            """
            INSERT INTO vitals_baseline (
                profile_id, date, resting_hr, hrv,
                vo2_max, spo2, respiratory_rate,
                note, source
            )
            VALUES (%s, %s, %s, %s, %s, %s, %s, %s, 'manual')
            ON CONFLICT (profile_id, date)
            DO UPDATE SET
                resting_hr = COALESCE(EXCLUDED.resting_hr, vitals_baseline.resting_hr),
                hrv = COALESCE(EXCLUDED.hrv, vitals_baseline.hrv),
                vo2_max = COALESCE(EXCLUDED.vo2_max, vitals_baseline.vo2_max),
                spo2 = COALESCE(EXCLUDED.spo2, vitals_baseline.spo2),
                respiratory_rate = COALESCE(EXCLUDED.respiratory_rate, vitals_baseline.respiratory_rate),
                note = COALESCE(EXCLUDED.note, vitals_baseline.note),
                updated_at = CURRENT_TIMESTAMP
            RETURNING id, profile_id, date, resting_hr, hrv,
                      vo2_max, spo2, respiratory_rate,
                      note, source, created_at, updated_at
            """,
            (pid, entry.date, entry.resting_hr, entry.hrv,
             entry.vo2_max, entry.spo2, entry.respiratory_rate,
             entry.note)
        )
        row = cur.fetchone()
        conn.commit()

    logger.info(f"[VITALS] Upserted baseline vitals for {pid} on {entry.date}")

    # Return in the legacy format for backward compatibility
    result = r2d(row)
    result['blood_pressure_systolic'] = None
    result['blood_pressure_diastolic'] = None
    result['pulse'] = None
    result['irregular_heartbeat'] = None
    result['possible_afib'] = None

    return result

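The `ON CONFLICT ... COALESCE(EXCLUDED.col, existing.col)` pattern above means a field omitted from the new payload (NULL) keeps its stored value rather than being overwritten. The same merge rule, sketched in plain Python for clarity:

```python
def coalesce_merge(existing: dict, incoming: dict) -> dict:
    """Mirror COALESCE(EXCLUDED.col, existing.col): None in the incoming
    payload preserves the stored value instead of clearing it."""
    merged = dict(existing)
    for key, value in incoming.items():
        if value is not None:
            merged[key] = value
    return merged
```

This is why a client can PATCH-style upsert just one vital per day (say, HRV in the morning, SpO2 at night) without erasing the rest of the row.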
@router.put("/{vitals_id}")
def update_vitals(
    vitals_id: int,
    updates: VitalsUpdate,
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """Update an existing vitals entry."""
    pid = get_pid(x_profile_id, session)

    with get_db() as conn:
        cur = get_cursor(conn)

        # Check ownership
        cur.execute(
            "SELECT id FROM vitals_log WHERE id = %s AND profile_id = %s",
            (vitals_id, pid)
        )
        if not cur.fetchone():
            raise HTTPException(404, "Eintrag nicht gefunden")

        # Build update query dynamically
        fields = []
        values = []

        if updates.date is not None:
            fields.append("date = %s")
            values.append(updates.date)
        if updates.resting_hr is not None:
            fields.append("resting_hr = %s")
            values.append(updates.resting_hr)
        if updates.hrv is not None:
            fields.append("hrv = %s")
            values.append(updates.hrv)
        if updates.blood_pressure_systolic is not None:
            fields.append("blood_pressure_systolic = %s")
            values.append(updates.blood_pressure_systolic)
        if updates.blood_pressure_diastolic is not None:
            fields.append("blood_pressure_diastolic = %s")
            values.append(updates.blood_pressure_diastolic)
        if updates.pulse is not None:
            fields.append("pulse = %s")
            values.append(updates.pulse)
        if updates.vo2_max is not None:
            fields.append("vo2_max = %s")
            values.append(updates.vo2_max)
        if updates.spo2 is not None:
            fields.append("spo2 = %s")
            values.append(updates.spo2)
        if updates.respiratory_rate is not None:
            fields.append("respiratory_rate = %s")
            values.append(updates.respiratory_rate)
        if updates.irregular_heartbeat is not None:
            fields.append("irregular_heartbeat = %s")
            values.append(updates.irregular_heartbeat)
        if updates.possible_afib is not None:
            fields.append("possible_afib = %s")
            values.append(updates.possible_afib)
        if updates.note is not None:
            fields.append("note = %s")
            values.append(updates.note)

        if not fields:
            raise HTTPException(400, "Keine Änderungen angegeben")

        fields.append("updated_at = CURRENT_TIMESTAMP")
        values.append(vitals_id)

        query = f"""
            UPDATE vitals_log
            SET {', '.join(fields)}
            WHERE id = %s
            RETURNING id, profile_id, date, resting_hr, hrv,
                      blood_pressure_systolic, blood_pressure_diastolic, pulse,
                      vo2_max, spo2, respiratory_rate,
                      irregular_heartbeat, possible_afib,
                      note, source, created_at, updated_at
        """

        cur.execute(query, values)
        row = cur.fetchone()
        conn.commit()

    return r2d(row)


@router.delete("/{vitals_id}")
def delete_vitals(
    vitals_id: int,
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """Delete a vitals entry."""
    pid = get_pid(x_profile_id, session)

    with get_db() as conn:
        cur = get_cursor(conn)

        # Check ownership and delete in one statement
        cur.execute(
            "DELETE FROM vitals_log WHERE id = %s AND profile_id = %s RETURNING id",
            (vitals_id, pid)
        )
        if not cur.fetchone():
            raise HTTPException(404, "Eintrag nicht gefunden")

        conn.commit()
        logger.info(f"[VITALS] Deleted vitals {vitals_id} for {pid}")
        return {"message": "Eintrag gelöscht"}


@router.get("/stats")
def get_vitals_stats(
    days: int = 30,
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """
    Get vitals statistics over the last N days.

    Returns:
    - avg_resting_hr (7d and 30d)
    - avg_hrv (7d and 30d)
    - trend (increasing/decreasing/stable)
    - latest values
    """
    pid = get_pid(x_profile_id, session)

    with get_db() as conn:
        cur = get_cursor(conn)

        # Get the latest entry
        cur.execute(
            """
            SELECT date, resting_hr, hrv
            FROM vitals_log
            WHERE profile_id = %s AND date >= CURRENT_DATE - INTERVAL '%s days'
            ORDER BY date DESC
            LIMIT 1
            """,
            (pid, days)
        )
        latest = cur.fetchone()

        # Get averages (7d and 30d)
        cur.execute(
            """
            SELECT
                AVG(CASE WHEN date >= CURRENT_DATE - INTERVAL '7 days' THEN resting_hr END) as avg_hr_7d,
                AVG(CASE WHEN date >= CURRENT_DATE - INTERVAL '30 days' THEN resting_hr END) as avg_hr_30d,
                AVG(CASE WHEN date >= CURRENT_DATE - INTERVAL '7 days' THEN hrv END) as avg_hrv_7d,
                AVG(CASE WHEN date >= CURRENT_DATE - INTERVAL '30 days' THEN hrv END) as avg_hrv_30d,
                AVG(CASE WHEN date >= CURRENT_DATE - INTERVAL '7 days' THEN blood_pressure_systolic END) as avg_bp_sys_7d,
                AVG(CASE WHEN date >= CURRENT_DATE - INTERVAL '30 days' THEN blood_pressure_systolic END) as avg_bp_sys_30d,
                AVG(CASE WHEN date >= CURRENT_DATE - INTERVAL '7 days' THEN blood_pressure_diastolic END) as avg_bp_dia_7d,
                AVG(CASE WHEN date >= CURRENT_DATE - INTERVAL '30 days' THEN blood_pressure_diastolic END) as avg_bp_dia_30d,
                AVG(CASE WHEN date >= CURRENT_DATE - INTERVAL '7 days' THEN spo2 END) as avg_spo2_7d,
                AVG(CASE WHEN date >= CURRENT_DATE - INTERVAL '30 days' THEN spo2 END) as avg_spo2_30d,
                COUNT(*) as total_entries
            FROM vitals_log
            WHERE profile_id = %s AND date >= CURRENT_DATE - INTERVAL '%s days'
            """,
            (pid, max(days, 30))
        )
        stats_row = cur.fetchone()

        # Get the latest VO2 Max
        cur.execute(
            """
            SELECT vo2_max
            FROM vitals_log
            WHERE profile_id = %s AND vo2_max IS NOT NULL
            ORDER BY date DESC
            LIMIT 1
            """,
            (pid,)
        )
        vo2_row = cur.fetchone()
        latest_vo2 = vo2_row['vo2_max'] if vo2_row else None

        # Get entries for trend calculation (last 14 days)
        cur.execute(
            """
            SELECT date, resting_hr, hrv
            FROM vitals_log
            WHERE profile_id = %s AND date >= CURRENT_DATE - INTERVAL '14 days'
            ORDER BY date ASC
            """,
            (pid,)
        )
        entries = [r2d(r) for r in cur.fetchall()]

        # Simple trend calculation (compare first half vs second half)
        trend_hr = "stable"
        trend_hrv = "stable"

        if len(entries) >= 4:
            mid = len(entries) // 2
            first_half_hr = [e['resting_hr'] for e in entries[:mid] if e['resting_hr']]
            second_half_hr = [e['resting_hr'] for e in entries[mid:] if e['resting_hr']]

            if first_half_hr and second_half_hr:
                avg_first = sum(first_half_hr) / len(first_half_hr)
                avg_second = sum(second_half_hr) / len(second_half_hr)
                diff = avg_second - avg_first

                if diff > 2:
                    trend_hr = "increasing"
                elif diff < -2:
                    trend_hr = "decreasing"

            first_half_hrv = [e['hrv'] for e in entries[:mid] if e['hrv']]
            second_half_hrv = [e['hrv'] for e in entries[mid:] if e['hrv']]

            if first_half_hrv and second_half_hrv:
                avg_first_hrv = sum(first_half_hrv) / len(first_half_hrv)
                avg_second_hrv = sum(second_half_hrv) / len(second_half_hrv)
                diff_hrv = avg_second_hrv - avg_first_hrv

                if diff_hrv > 5:
                    trend_hrv = "increasing"
                elif diff_hrv < -5:
                    trend_hrv = "decreasing"

        return {
            "latest": r2d(latest) if latest else None,
            "avg_resting_hr_7d": round(stats_row['avg_hr_7d'], 1) if stats_row['avg_hr_7d'] else None,
            "avg_resting_hr_30d": round(stats_row['avg_hr_30d'], 1) if stats_row['avg_hr_30d'] else None,
            "avg_hrv_7d": round(stats_row['avg_hrv_7d'], 1) if stats_row['avg_hrv_7d'] else None,
            "avg_hrv_30d": round(stats_row['avg_hrv_30d'], 1) if stats_row['avg_hrv_30d'] else None,
            "avg_bp_systolic_7d": round(stats_row['avg_bp_sys_7d'], 1) if stats_row['avg_bp_sys_7d'] else None,
            "avg_bp_systolic_30d": round(stats_row['avg_bp_sys_30d'], 1) if stats_row['avg_bp_sys_30d'] else None,
            "avg_bp_diastolic_7d": round(stats_row['avg_bp_dia_7d'], 1) if stats_row['avg_bp_dia_7d'] else None,
            "avg_bp_diastolic_30d": round(stats_row['avg_bp_dia_30d'], 1) if stats_row['avg_bp_dia_30d'] else None,
            "avg_spo2_7d": round(stats_row['avg_spo2_7d'], 1) if stats_row['avg_spo2_7d'] else None,
            "avg_spo2_30d": round(stats_row['avg_spo2_30d'], 1) if stats_row['avg_spo2_30d'] else None,
            "latest_vo2_max": float(latest_vo2) if latest_vo2 else None,
            "total_entries": stats_row['total_entries'],
            "trend_resting_hr": trend_hr,
            "trend_hrv": trend_hrv,
            "period_days": days
        }


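The trend logic above compares the average of the first half of the 14-day window against the second half, with a dead band of ±2 bpm for resting HR (±5 ms for HRV) so small fluctuations read as "stable". A standalone sketch of that heuristic, parameterized on the threshold:

```python
def half_split_trend(series, threshold):
    """Compare first-half vs second-half means of a time-ordered series.

    Entries may be None (missing measurement); they are filtered out,
    matching the truthiness filter in the endpoint. Needs >= 4 points.
    """
    if len(series) < 4:
        return "stable"
    mid = len(series) // 2
    first = [v for v in series[:mid] if v]
    second = [v for v in series[mid:] if v]
    if not first or not second:
        return "stable"
    diff = sum(second) / len(second) - sum(first) / len(first)
    if diff > threshold:
        return "increasing"
    if diff < -threshold:
        return "decreasing"
    return "stable"
```

For example, `half_split_trend([50, 50, 55, 56], 2)` flags a rising resting heart rate, while the same series under a threshold of 10 would be "stable".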
# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
# Import Endpoints
# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━


def parse_omron_date(date_str: str) -> str:
    """
    Parse the Omron German date format to YYYY-MM-DD.

    Examples:
    - "13 März 2026" -> "2026-03-13"
    - "28 Feb. 2026" -> "2026-02-28"
    """
    parts = date_str.strip().split()
    if len(parts) != 3:
        raise ValueError(f"Invalid date format: {date_str}")

    day = parts[0].zfill(2)
    month_str = parts[1]
    year = parts[2]

    # Map the German month name to its number
    month = GERMAN_MONTHS.get(month_str)
    if not month:
        raise ValueError(f"Unknown month: {month_str}")

    return f"{year}-{month}-{day}"


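A self-contained copy of the same parsing rule, handy for verifying the formats listed in the docstring (month table trimmed to the two examples; the full mapping lives in GERMAN_MONTHS above):

```python
GERMAN_MONTHS = {'März': '03', 'Feb.': '02'}  # trimmed for the sketch

def parse_omron_date(date_str: str) -> str:
    """'13 März 2026' -> '2026-03-13'; raises ValueError on bad input."""
    parts = date_str.strip().split()
    if len(parts) != 3:
        raise ValueError(f"Invalid date format: {date_str}")
    day, month_str, year = parts[0].zfill(2), parts[1], parts[2]
    month = GERMAN_MONTHS.get(month_str)
    if not month:
        raise ValueError(f"Unknown month: {month_str}")
    return f"{year}-{month}-{day}"
```

Note that formats like "13.03.2026" split into a single token and are rejected, which is what lets the import report them as per-row errors rather than storing garbage dates.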
@router.post("/import/omron")
async def import_omron_csv(
    file: UploadFile = File(...),
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """
    Import an Omron blood pressure CSV export.

    Expected format:
    Datum,Zeit,Systolisch (mmHg),Diastolisch (mmHg),Puls (bpm),...
    """
    pid = get_pid(x_profile_id, session)

    # Read the uploaded file
    content = await file.read()
    content_str = content.decode('utf-8')

    # Parse CSV
    reader = csv.DictReader(io.StringIO(content_str))

    inserted = 0
    updated = 0
    skipped = 0
    errors = []

    with get_db() as conn:
        cur = get_cursor(conn)

        for row_num, row in enumerate(reader, start=2):
            try:
                # Parse date
                date_str = parse_omron_date(row['Datum'])

                # Parse values ('-' marks a missing reading)
                systolic = int(row['Systolisch (mmHg)']) if row['Systolisch (mmHg)'] and row['Systolisch (mmHg)'] != '-' else None
                diastolic = int(row['Diastolisch (mmHg)']) if row['Diastolisch (mmHg)'] and row['Diastolisch (mmHg)'] != '-' else None
                pulse = int(row['Puls (bpm)']) if row['Puls (bpm)'] and row['Puls (bpm)'] != '-' else None

                # Skip rows with no data
                if not systolic and not diastolic and not pulse:
                    skipped += 1
                    continue

                # Parse flags (optional columns)
                irregular = row.get('Unregelmäßiger Herzschlag festgestellt', '').strip() not in ('', '-', ' ')
                afib = row.get('Mögliches AFib', '').strip() not in ('', '-', ' ')

                # Upsert
                cur.execute(
                    """
                    INSERT INTO vitals_log (
                        profile_id, date, blood_pressure_systolic, blood_pressure_diastolic,
                        pulse, irregular_heartbeat, possible_afib, source
                    )
                    VALUES (%s, %s, %s, %s, %s, %s, %s, 'omron')
                    ON CONFLICT (profile_id, date)
                    DO UPDATE SET
                        blood_pressure_systolic = COALESCE(EXCLUDED.blood_pressure_systolic, vitals_log.blood_pressure_systolic),
                        blood_pressure_diastolic = COALESCE(EXCLUDED.blood_pressure_diastolic, vitals_log.blood_pressure_diastolic),
                        pulse = COALESCE(EXCLUDED.pulse, vitals_log.pulse),
                        irregular_heartbeat = COALESCE(EXCLUDED.irregular_heartbeat, vitals_log.irregular_heartbeat),
                        possible_afib = COALESCE(EXCLUDED.possible_afib, vitals_log.possible_afib),
                        source = CASE WHEN vitals_log.source = 'manual' THEN vitals_log.source ELSE 'omron' END,
                        updated_at = CURRENT_TIMESTAMP
                    RETURNING (xmax = 0) AS inserted
                    """,
                    (pid, date_str, systolic, diastolic, pulse, irregular, afib)
                )

                result = cur.fetchone()
                if result['inserted']:
                    inserted += 1
                else:
                    updated += 1

            except Exception as e:
                errors.append(f"Zeile {row_num}: {str(e)}")
                logger.error(f"[OMRON-IMPORT] Error at row {row_num}: {e}")
                continue

        conn.commit()

    logger.info(f"[OMRON-IMPORT] {pid}: {inserted} inserted, {updated} updated, {skipped} skipped, {len(errors)} errors")

    return {
        "message": "Omron CSV Import abgeschlossen",
        "inserted": inserted,
        "updated": updated,
        "skipped": skipped,
        "errors": errors[:10]  # Limit to the first 10 errors
    }

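Omron exports mark missing readings with "-". The repeated per-cell conditionals above can be factored into one helper; a sketch with a tiny inline CSV using the same German headers as the import (the helper name is hypothetical):

```python
import csv
import io

def parse_cell(raw):
    """Omron cells use '-' (or empty) for a missing reading."""
    return int(raw) if raw and raw != '-' else None

sample = (
    "Datum,Systolisch (mmHg),Diastolisch (mmHg),Puls (bpm)\n"
    "13 März 2026,120,80,62\n"
    "14 März 2026,-,-,-\n"
)
rows = list(csv.DictReader(io.StringIO(sample)))
parsed = [(parse_cell(r['Systolisch (mmHg)']),
           parse_cell(r['Diastolisch (mmHg)']),
           parse_cell(r['Puls (bpm)'])) for r in rows]
```

The second row parses to all-None, which is exactly the case the endpoint counts as "skipped" before attempting an upsert.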
@router.post("/import/apple-health")
async def import_apple_health_csv(
    file: UploadFile = File(...),
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """
    Import an Apple Health vitals CSV export.

    Expected columns:
    - Datum/Uhrzeit
    - Ruhepuls (count/min)
    - Herzfrequenzvariabilität (ms)
    - VO2 max (ml/(kg·min))
    - Blutsauerstoffsättigung (%)
    - Atemfrequenz (count/min)
    """
    pid = get_pid(x_profile_id, session)

    # Read the uploaded file
    content = await file.read()
    content_str = content.decode('utf-8')

    # Parse CSV
    reader = csv.DictReader(io.StringIO(content_str))

    inserted = 0
    updated = 0
    skipped = 0
    errors = []

    with get_db() as conn:
        cur = get_cursor(conn)

        for row_num, row in enumerate(reader, start=2):
            try:
                # Parse date (format: "2026-02-21 00:00:00"); keep the date part
                date_str = row.get('Datum/Uhrzeit', '').split()[0] if row.get('Datum/Uhrzeit', '').split() else ''
                if not date_str:
                    skipped += 1
                    continue

                # Parse values (columns may be empty)
                resting_hr = None
                hrv = None
                vo2_max = None
                spo2 = None
                respiratory_rate = None

                if 'Ruhepuls (count/min)' in row and row['Ruhepuls (count/min)']:
                    resting_hr = int(float(row['Ruhepuls (count/min)']))

                if 'Herzfrequenzvariabilität (ms)' in row and row['Herzfrequenzvariabilität (ms)']:
                    hrv = int(float(row['Herzfrequenzvariabilität (ms)']))

                if 'VO2 max (ml/(kg·min))' in row and row['VO2 max (ml/(kg·min))']:
                    vo2_max = float(row['VO2 max (ml/(kg·min))'])

                if 'Blutsauerstoffsättigung (%)' in row and row['Blutsauerstoffsättigung (%)']:
                    spo2 = int(float(row['Blutsauerstoffsättigung (%)']))

                if 'Atemfrequenz (count/min)' in row and row['Atemfrequenz (count/min)']:
                    respiratory_rate = float(row['Atemfrequenz (count/min)'])

                # Skip rows with no vitals data
                if not any([resting_hr, hrv, vo2_max, spo2, respiratory_rate]):
                    skipped += 1
                    continue

                # Upsert
                cur.execute(
                    """
                    INSERT INTO vitals_log (
                        profile_id, date, resting_hr, hrv, vo2_max, spo2,
                        respiratory_rate, source
                    )
                    VALUES (%s, %s, %s, %s, %s, %s, %s, 'apple_health')
                    ON CONFLICT (profile_id, date)
                    DO UPDATE SET
                        resting_hr = COALESCE(EXCLUDED.resting_hr, vitals_log.resting_hr),
                        hrv = COALESCE(EXCLUDED.hrv, vitals_log.hrv),
                        vo2_max = COALESCE(EXCLUDED.vo2_max, vitals_log.vo2_max),
                        spo2 = COALESCE(EXCLUDED.spo2, vitals_log.spo2),
                        respiratory_rate = COALESCE(EXCLUDED.respiratory_rate, vitals_log.respiratory_rate),
                        source = CASE WHEN vitals_log.source = 'manual' THEN vitals_log.source ELSE 'apple_health' END,
                        updated_at = CURRENT_TIMESTAMP
                    RETURNING (xmax = 0) AS inserted
                    """,
                    (pid, date_str, resting_hr, hrv, vo2_max, spo2, respiratory_rate)
                )

                result = cur.fetchone()
                if result['inserted']:
                    inserted += 1
                else:
                    updated += 1

            except Exception as e:
                errors.append(f"Zeile {row_num}: {str(e)}")
                logger.error(f"[APPLE-HEALTH-IMPORT] Error at row {row_num}: {e}")
                continue

        conn.commit()

    logger.info(f"[APPLE-HEALTH-IMPORT] {pid}: {inserted} inserted, {updated} updated, {skipped} skipped, {len(errors)} errors")

    return {
        "message": "Apple Health CSV Import abgeschlossen",
        "inserted": inserted,
        "updated": updated,
        "skipped": skipped,
        "errors": errors[:10]  # Limit to the first 10 errors
    }
452 backend/routers/vitals_baseline.py (new file)
@@ -0,0 +1,452 @@
"""
Vitals Baseline Router - v9d Phase 2d Refactored

Baseline vitals measured once daily (morning, fasted):
- Resting Heart Rate (RHR)
- Heart Rate Variability (HRV)
- VO2 Max
- SpO2 (Blood Oxygen Saturation)
- Respiratory Rate

Endpoints:
- GET    /api/vitals/baseline                      List baseline vitals
- GET    /api/vitals/baseline/by-date/{date}       Get entry for a specific date
- POST   /api/vitals/baseline                      Create/update baseline entry (upsert)
- PUT    /api/vitals/baseline/{id}                 Update baseline entry
- DELETE /api/vitals/baseline/{id}                 Delete baseline entry
- GET    /api/vitals/baseline/stats                Statistics and trends
- POST   /api/vitals/baseline/import/apple-health  Import Apple Health CSV
"""
from fastapi import APIRouter, HTTPException, Depends, Header, UploadFile, File
from pydantic import BaseModel
from typing import Optional
from datetime import datetime, timedelta
import logging
import csv
import io

from db import get_db, get_cursor, r2d
from auth import require_auth
from routers.profiles import get_pid

router = APIRouter(prefix="/api/vitals/baseline", tags=["vitals_baseline"])
logger = logging.getLogger(__name__)


# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
# Pydantic Models
# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

class BaselineEntry(BaseModel):
    date: str
    resting_hr: Optional[int] = None
    hrv: Optional[int] = None
    vo2_max: Optional[float] = None
    spo2: Optional[int] = None
    respiratory_rate: Optional[float] = None
    body_temperature: Optional[float] = None
    resting_metabolic_rate: Optional[int] = None
    note: Optional[str] = None


# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
# CRUD Endpoints
# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

@router.get("")
def list_baseline_vitals(
    limit: int = 90,
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """Get baseline vitals (last N days)."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""
            SELECT * FROM vitals_baseline
            WHERE profile_id = %s
            ORDER BY date DESC
            LIMIT %s
        """, (pid, limit))
        return [r2d(r) for r in cur.fetchall()]


@router.get("/by-date/{date}")
def get_baseline_by_date(
    date: str,
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """Get baseline entry for a specific date."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""
            SELECT * FROM vitals_baseline
            WHERE profile_id = %s AND date = %s
        """, (pid, date))
        row = cur.fetchone()
        return r2d(row) if row else None


@router.post("")
def create_or_update_baseline(
    entry: BaselineEntry,
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """Create or update baseline entry (upsert on date)."""
    pid = get_pid(x_profile_id)

    # Build dynamic INSERT columns, placeholders, UPDATE fields, and values list.
    # All four lists must stay synchronized.
    insert_cols = []
    insert_placeholders = []
    update_fields = []
    param_values = []  # Will contain ALL values, including pid and date

    # Always include profile_id and date
    param_values.append(pid)
    param_values.append(entry.date)

    # Attribute names match the column names, so the optional numeric fields
    # can be handled uniformly
    for field in (
        "resting_hr", "hrv", "vo2_max", "spo2",
        "respiratory_rate", "body_temperature", "resting_metabolic_rate",
    ):
        value = getattr(entry, field)
        if value is not None:
            insert_cols.append(field)
            insert_placeholders.append("%s")
            update_fields.append(f"{field} = EXCLUDED.{field}")
            param_values.append(value)

    if entry.note:
        insert_cols.append("note")
        insert_placeholders.append("%s")
        update_fields.append("note = EXCLUDED.note")
        param_values.append(entry.note)

    # At least one field must be provided
    if not insert_cols:
        raise HTTPException(400, "At least one baseline vital must be provided")

    with get_db() as conn:
        cur = get_cursor(conn)

        # Build complete column list and placeholder list.
        # IMPORTANT: psycopg2 uses %s placeholders, NOT $1/$2/$3
        all_cols = f"profile_id, date, {', '.join(insert_cols)}"
        all_placeholders = f"%s, %s, {', '.join(insert_placeholders)}"

        query = f"""
            INSERT INTO vitals_baseline ({all_cols})
            VALUES ({all_placeholders})
            ON CONFLICT (profile_id, date)
            DO UPDATE SET {', '.join(update_fields)}, updated_at = NOW()
            RETURNING *
        """

        logger.debug(f"Vitals baseline query: {query}")
        logger.debug(f"Param values ({len(param_values)}): {param_values}")

        cur.execute(query, tuple(param_values))
        return r2d(cur.fetchone())

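The dynamic upsert above assembles its SQL from synchronized lists, which can be hard to picture. This standalone sketch (with a hypothetical entry that sets only resting_hr and hrv) shows the query shape the endpoint would produce:

```python
# Sketch of the dynamic upsert assembly, assuming only resting_hr and hrv are set
insert_cols = ["resting_hr", "hrv"]
update_fields = [f"{c} = EXCLUDED.{c}" for c in insert_cols]

all_cols = f"profile_id, date, {', '.join(insert_cols)}"
all_placeholders = f"%s, %s, {', '.join(['%s'] * len(insert_cols))}"

query = (
    f"INSERT INTO vitals_baseline ({all_cols}) "
    f"VALUES ({all_placeholders}) "
    f"ON CONFLICT (profile_id, date) "
    f"DO UPDATE SET {', '.join(update_fields)}, updated_at = NOW()"
)
print(query)
```

The placeholder count (four `%s` here: pid, date, plus two field values) always matches the length of the value tuple passed to `cur.execute`.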
@router.put("/{entry_id}")
def update_baseline(
    entry_id: int,
    entry: BaselineEntry,
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """Update existing baseline entry."""
    pid = get_pid(x_profile_id)

    # Build SET clause dynamically (psycopg2 uses %s placeholders, not $N)
    updates = []
    values = []

    if entry.resting_hr is not None:
        updates.append("resting_hr = %s")
        values.append(entry.resting_hr)
    if entry.hrv is not None:
        updates.append("hrv = %s")
        values.append(entry.hrv)
    if entry.vo2_max is not None:
        updates.append("vo2_max = %s")
        values.append(entry.vo2_max)
    if entry.spo2 is not None:
        updates.append("spo2 = %s")
        values.append(entry.spo2)
    if entry.respiratory_rate is not None:
        updates.append("respiratory_rate = %s")
        values.append(entry.respiratory_rate)
    if entry.note:
        updates.append("note = %s")
        values.append(entry.note)

    if not updates:
        raise HTTPException(400, "No fields to update")

    updates.append("updated_at = NOW()")
    values.extend([entry_id, pid])

    with get_db() as conn:
        cur = get_cursor(conn)
        query = f"""
            UPDATE vitals_baseline
            SET {', '.join(updates)}
            WHERE id = %s AND profile_id = %s
            RETURNING *
        """
        cur.execute(query, tuple(values))
        row = cur.fetchone()
        if not row:
            raise HTTPException(404, "Entry not found")
        return r2d(row)


@router.delete("/{entry_id}")
def delete_baseline(
    entry_id: int,
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """Delete baseline entry."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("""
            DELETE FROM vitals_baseline
            WHERE id = %s AND profile_id = %s
        """, (entry_id, pid))
        if cur.rowcount == 0:
            raise HTTPException(404, "Entry not found")
        return {"ok": True}


# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
# Statistics & Trends
# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

@router.get("/stats")
def get_baseline_stats(
    days: int = 30,
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """Get baseline vitals statistics and trends."""
    pid = get_pid(x_profile_id)
    cutoff_date = (datetime.now() - timedelta(days=days)).date()

    with get_db() as conn:
        cur = get_cursor(conn)
        # Rolling 7d/30d windows are relative to today; the outer WHERE limits
        # the scan to the requested range
        cur.execute("""
            SELECT
                COUNT(*) as total_entries,
                -- Resting HR
                AVG(resting_hr) FILTER (WHERE date >= CURRENT_DATE - INTERVAL '7 days') as avg_rhr_7d,
                AVG(resting_hr) FILTER (WHERE date >= CURRENT_DATE - INTERVAL '30 days') as avg_rhr_30d,
                -- HRV
                AVG(hrv) FILTER (WHERE date >= CURRENT_DATE - INTERVAL '7 days') as avg_hrv_7d,
                AVG(hrv) FILTER (WHERE date >= CURRENT_DATE - INTERVAL '30 days') as avg_hrv_30d,
                -- Latest values
                (SELECT vo2_max FROM vitals_baseline WHERE profile_id = %s AND vo2_max IS NOT NULL ORDER BY date DESC LIMIT 1) as latest_vo2_max,
                AVG(spo2) FILTER (WHERE date >= CURRENT_DATE - INTERVAL '7 days') as avg_spo2_7d
            FROM vitals_baseline
            WHERE profile_id = %s AND date >= %s
        """, (pid, pid, cutoff_date))

        stats = r2d(cur.fetchone())

        # Calculate trends (7d vs 30d)
        if stats['avg_rhr_7d'] and stats['avg_rhr_30d']:
            if stats['avg_rhr_7d'] < stats['avg_rhr_30d'] - 2:
                stats['trend_rhr'] = 'decreasing'  # Good!
            elif stats['avg_rhr_7d'] > stats['avg_rhr_30d'] + 2:
                stats['trend_rhr'] = 'increasing'  # Warning
            else:
                stats['trend_rhr'] = 'stable'
        else:
            stats['trend_rhr'] = None

        if stats['avg_hrv_7d'] and stats['avg_hrv_30d']:
            if stats['avg_hrv_7d'] > stats['avg_hrv_30d'] + 5:
                stats['trend_hrv'] = 'increasing'  # Good!
            elif stats['avg_hrv_7d'] < stats['avg_hrv_30d'] - 5:
                stats['trend_hrv'] = 'decreasing'  # Warning
            else:
                stats['trend_hrv'] = 'stable'
        else:
            stats['trend_hrv'] = None

        return stats


# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
# Import: Apple Health CSV
# ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

def safe_int(value):
    """Safely parse a string to int, handling decimal strings."""
    if not value:
        return None
    try:
        # If it has a decimal point, parse as float first, then truncate to int
        if '.' in str(value):
            return int(float(value))
        return int(value)
    except (ValueError, TypeError):
        return None


def safe_float(value):
    """Safely parse a string to float."""
    if not value:
        return None
    try:
        return float(value)
    except (ValueError, TypeError):
        return None

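The detour through `float` in `safe_int` exists because `int()` rejects decimal strings outright, while Apple Health exports often contain them even for count-type fields. A quick standalone demonstration:

```python
# int() cannot parse a decimal string directly ...
try:
    int("72.5")
except ValueError as e:
    print("int() rejects decimal strings:", e)

# ... so safe_int goes through float first; int(float) truncates toward zero
print(int(float("72.5")))
```

Note that this truncates rather than rounds, so "72.9" also becomes 72.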
@router.post("/import/apple-health")
async def import_apple_health_baseline(
    file: UploadFile = File(...),
    x_profile_id: Optional[str] = Header(default=None),
    session: dict = Depends(require_auth)
):
    """Import baseline vitals from an Apple Health CSV export."""
    pid = get_pid(x_profile_id)

    content = await file.read()
    decoded = content.decode('utf-8')
    reader = csv.DictReader(io.StringIO(decoded))

    inserted = 0
    updated = 0
    skipped = 0
    errors = 0
    error_details = []  # Collect error messages

    with get_db() as conn:
        cur = get_cursor(conn)

        # Log available columns for debugging
        first_row = True

        for row in reader:
            try:
                if first_row:
                    logger.info(f"CSV Columns: {list(row.keys())}")
                    first_row = False

                # Support both English and German column names
                date_raw = row.get('Start') or row.get('Datum/Uhrzeit')
                date = date_raw[:10] if date_raw else None
                if not date:
                    logger.warning(f"Skipped row (no date): Start='{row.get('Start')}', Datum/Uhrzeit='{row.get('Datum/Uhrzeit')}'")
                    skipped += 1
                    continue

                # Extract baseline vitals (support English + German column names)
                rhr = row.get('Resting Heart Rate') or row.get('Ruhepuls (count/min)')
                hrv = row.get('Heart Rate Variability') or row.get('Herzfrequenzvariabilität (ms)')
                vo2 = row.get('VO2 Max') or row.get('VO2 max (ml/(kg·min))')
                spo2 = row.get('Oxygen Saturation') or row.get('Blutsauerstoffsättigung (%)')
                resp_rate = row.get('Respiratory Rate') or row.get('Atemfrequenz (count/min)')

                # Skip rows without any baseline vitals
                if not any([rhr, hrv, vo2, spo2, resp_rate]):
                    logger.warning(f"Skipped row {date} (no vitals): RHR={rhr}, HRV={hrv}, VO2={vo2}, SpO2={spo2}, RespRate={resp_rate}")
                    skipped += 1
                    continue

                # Upsert; manual entries are never overwritten
                cur.execute("""
                    INSERT INTO vitals_baseline (
                        profile_id, date,
                        resting_hr, hrv, vo2_max, spo2, respiratory_rate,
                        source
                    ) VALUES (%s, %s, %s, %s, %s, %s, %s, 'apple_health')
                    ON CONFLICT (profile_id, date)
                    DO UPDATE SET
                        resting_hr = COALESCE(EXCLUDED.resting_hr, vitals_baseline.resting_hr),
                        hrv = COALESCE(EXCLUDED.hrv, vitals_baseline.hrv),
                        vo2_max = COALESCE(EXCLUDED.vo2_max, vitals_baseline.vo2_max),
                        spo2 = COALESCE(EXCLUDED.spo2, vitals_baseline.spo2),
                        respiratory_rate = COALESCE(EXCLUDED.respiratory_rate, vitals_baseline.respiratory_rate),
                        updated_at = NOW()
                    WHERE vitals_baseline.source != 'manual'
                    RETURNING (xmax = 0) AS inserted
                """, (
                    pid, date,
                    safe_int(rhr),
                    safe_int(hrv),
                    safe_float(vo2),
                    safe_int(spo2),
                    safe_float(resp_rate)
                ))

                result = cur.fetchone()
                if result is None:
                    # WHERE clause prevented the update (a manual entry exists)
                    skipped += 1
                elif result['inserted']:
                    inserted += 1
                else:
                    updated += 1

            except Exception as e:
                import traceback
                error_msg = f"Row {date if 'date' in locals() else 'unknown'}: {str(e)}"
                error_details.append(error_msg)
                logger.error(f"{error_msg}\n{traceback.format_exc()}")
                errors += 1

    return {
        "inserted": inserted,
        "updated": updated,
        "skipped": skipped,
        "errors": errors,
        "error_details": error_details[:10]  # Return the first 10 errors
    }
111  backend/routers/weight.py  (new file)
@@ -0,0 +1,111 @@
"""
Weight Tracking Endpoints for Mitai Jinkendo

Handles weight log CRUD operations and statistics.
"""
import uuid
import logging
from typing import Optional

from fastapi import APIRouter, Header, Depends, HTTPException

from db import get_db, get_cursor, r2d
from auth import require_auth, check_feature_access, increment_feature_usage
from models import WeightEntry
from routers.profiles import get_pid
from feature_logger import log_feature_usage

router = APIRouter(prefix="/api/weight", tags=["weight"])
logger = logging.getLogger(__name__)

@router.get("")
def list_weight(limit: int = 365, x_profile_id: Optional[str] = Header(default=None), session: dict = Depends(require_auth)):
    """Get weight entries for the current profile."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute(
            "SELECT * FROM weight_log WHERE profile_id=%s ORDER BY date DESC LIMIT %s", (pid, limit))
        return [r2d(r) for r in cur.fetchall()]


@router.post("")
def upsert_weight(e: WeightEntry, x_profile_id: Optional[str] = Header(default=None), session: dict = Depends(require_auth)):
    """Create or update weight entry (upsert by date)."""
    pid = get_pid(x_profile_id)

    # Phase 4: Check feature access and ENFORCE
    access = check_feature_access(pid, 'weight_entries')

    # Structured logging (always)
    log_feature_usage(pid, 'weight_entries', access, 'create')

    # BLOCK if limit exceeded
    if not access['allowed']:
        logger.warning(
            f"[FEATURE-LIMIT] User {pid} blocked: "
            f"weight_entries {access['reason']} (used: {access['used']}, limit: {access['limit']})"
        )
        raise HTTPException(
            status_code=403,
            detail=f"Limit reached: you have exceeded the quota for weight entries ({access['used']}/{access['limit']}). "
                   f"Please contact the admin or wait for the next reset."
        )

    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("SELECT id FROM weight_log WHERE profile_id=%s AND date=%s", (pid, e.date))
        ex = cur.fetchone()
        is_new_entry = not ex

        if ex:
            # UPDATE existing entry
            cur.execute("UPDATE weight_log SET weight=%s,note=%s WHERE id=%s", (e.weight, e.note, ex['id']))
            wid = ex['id']
        else:
            # INSERT new entry
            wid = str(uuid.uuid4())
            cur.execute("INSERT INTO weight_log (id,profile_id,date,weight,note,created) VALUES (%s,%s,%s,%s,%s,CURRENT_TIMESTAMP)",
                        (wid, pid, e.date, e.weight, e.note))

        # Phase 2: Increment usage counter (only for new entries)
        if is_new_entry:
            increment_feature_usage(pid, 'weight_entries')

    return {"id": wid, "date": e.date, "weight": e.weight}


@router.put("/{wid}")
def update_weight(wid: str, e: WeightEntry, x_profile_id: Optional[str] = Header(default=None), session: dict = Depends(require_auth)):
    """Update existing weight entry."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("UPDATE weight_log SET date=%s,weight=%s,note=%s WHERE id=%s AND profile_id=%s",
                    (e.date, e.weight, e.note, wid, pid))
        return {"id": wid}


@router.delete("/{wid}")
def delete_weight(wid: str, x_profile_id: Optional[str] = Header(default=None), session: dict = Depends(require_auth)):
    """Delete weight entry."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("DELETE FROM weight_log WHERE id=%s AND profile_id=%s", (wid, pid))
        return {"ok": True}


@router.get("/stats")
def weight_stats(x_profile_id: Optional[str] = Header(default=None), session: dict = Depends(require_auth)):
    """Get weight statistics (last 90 days)."""
    pid = get_pid(x_profile_id)
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute("SELECT date,weight FROM weight_log WHERE profile_id=%s ORDER BY date DESC LIMIT 90", (pid,))
        rows = cur.fetchall()
        if not rows:
            return {"count": 0, "latest": None, "prev": None, "min": None, "max": None, "avg_7d": None}
        w = [float(r['weight']) for r in rows]
        return {
            "count": len(rows),
            "latest": {"date": rows[0]['date'], "weight": float(rows[0]['weight'])},
            "prev": {"date": rows[1]['date'], "weight": float(rows[1]['weight'])} if len(rows) > 1 else None,
            "min": min(w),
            "max": max(w),
            "avg_7d": round(sum(w[:7]) / min(7, len(w)), 2),
        }
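Because the rows come back newest first, `w[:7]` in the stats endpoint is the seven most recent weights (or fewer when the history is short), and `min(7, len(w))` keeps the divisor correct in that short-history case. A standalone sketch with hypothetical values:

```python
# Hypothetical weights, newest first; only four entries exist
w = [81.2, 81.5, 80.9, 81.0]

# Same formula as the endpoint: average over up to the last seven entries
avg_7d = round(sum(w[:7]) / min(7, len(w)), 2)
print(avg_7d)
```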
427  backend/rule_engine.py  (new file)
@@ -0,0 +1,427 @@
"""
Training Type Profiles - Rule Engine
Flexible rule evaluation system for activity quality assessment.

Issue: #15
Date: 2026-03-23
"""
from typing import Any, Dict, List, Optional, Callable
from datetime import datetime
import logging

logger = logging.getLogger(__name__)


class RuleEvaluator:
    """
    Generic rule evaluator for arbitrary parameters and operators.

    Supports flexible rule definitions with various operators:
    - gte, lte, gt, lt: Comparison operators
    - eq, neq: Equality operators
    - between: Range checks
    - in, not_in: Set membership
    """

    # Operator definitions
    OPERATORS: Dict[str, Callable[[Any, Any], bool]] = {
        "gte": lambda actual, expected: actual is not None and actual >= expected,
        "lte": lambda actual, expected: actual is not None and actual <= expected,
        "gt": lambda actual, expected: actual is not None and actual > expected,
        "lt": lambda actual, expected: actual is not None and actual < expected,
        "eq": lambda actual, expected: actual == expected,
        "neq": lambda actual, expected: actual != expected,
        "between": lambda actual, expected: actual is not None and expected[0] <= actual <= expected[1],
        "in": lambda actual, expected: actual in expected,
        "not_in": lambda actual, expected: actual not in expected,
    }

    OPERATOR_SYMBOLS = {
        "gte": "≥",
        "lte": "≤",
        "gt": ">",
        "lt": "<",
        "eq": "=",
        "neq": "≠",
        "between": "⟷",
        "in": "∈",
        "not_in": "∉",
    }

    @classmethod
    def evaluate_rule(
        cls,
        rule: Dict,
        activity: Dict,
        parameters_registry: Dict[str, Dict]
    ) -> Dict:
        """
        Evaluates a single rule against an activity.

        Args:
            rule: {
                "parameter": str,
                "operator": str,
                "value": Any,
                "weight": int,
                "optional": bool,
                "reason": str
            }
            activity: Activity data dictionary
            parameters_registry: Mapping parameter_key -> config

        Returns:
            {
                "passed": bool,
                "actual_value": Any,
                "expected_value": Any,
                "parameter": str,
                "operator": str,
                "operator_symbol": str,
                "reason": str,
                "weight": int,
                "skipped": bool (optional),
                "error": str (optional)
            }
        """
        param_key = rule.get("parameter")
        operator = rule.get("operator")
        expected_value = rule.get("value")
        weight = rule.get("weight", 1)
        reason = rule.get("reason", "")
        optional = rule.get("optional", False)

        # Get parameter configuration
        param_config = parameters_registry.get(param_key)
        if not param_config:
            return {
                "passed": False,
                "parameter": param_key,
                "error": f"Unknown parameter: {param_key}"
            }

        # Extract value from activity
        source_field = param_config.get("source_field", param_key)
        actual_value = activity.get(source_field)

        # Optional and not provided? → Pass
        if optional and actual_value is None:
            return {
                "passed": True,
                "actual_value": None,
                "expected_value": expected_value,
                "parameter": param_key,
                "operator": operator,
                "operator_symbol": cls.OPERATOR_SYMBOLS.get(operator, operator),
                "reason": "Optional parameter not provided",
                "weight": weight,
                "skipped": True
            }

        # Required but not provided? → Fail
        if actual_value is None:
            return {
                "passed": False,
                "actual_value": None,
                "expected_value": expected_value,
                "parameter": param_key,
                "operator": operator,
                "operator_symbol": cls.OPERATOR_SYMBOLS.get(operator, operator),
                "reason": reason or "Required parameter missing",
                "weight": weight
            }

        # Apply operator
        operator_func = cls.OPERATORS.get(operator)
        if not operator_func:
            return {
                "passed": False,
                "parameter": param_key,
                "error": f"Unknown operator: {operator}"
            }

        try:
            passed = operator_func(actual_value, expected_value)
        except Exception as e:
            logger.error(f"[RULE-ENGINE] Error evaluating rule {param_key}: {e}")
            return {
                "passed": False,
                "parameter": param_key,
                "error": f"Evaluation error: {str(e)}"
            }

        return {
            "passed": passed,
            "actual_value": actual_value,
            "expected_value": expected_value,
            "parameter": param_key,
            "operator": operator,
            "operator_symbol": cls.OPERATOR_SYMBOLS.get(operator, operator),
            "reason": reason,
            "weight": weight
        }

    @classmethod
    def evaluate_rule_set(
        cls,
        rule_set: Dict,
        activity: Dict,
        parameters_registry: Dict[str, Dict]
    ) -> Dict:
        """
        Evaluates a complete rule set (e.g., minimum_requirements).

        Args:
            rule_set: {
                "enabled": bool,
                "pass_strategy": str,
                "pass_threshold": float,
                "rules": [...]
            }
            activity: Activity data
            parameters_registry: Parameter configurations

        Returns:
            {
                "enabled": bool,
                "passed": bool,
                "score": float (0-1),
                "rule_results": [...],
                "pass_strategy": str,
                "pass_threshold": float,
                "failed_rules": [...]
            }
        """
        if not rule_set.get("enabled", False):
            return {
                "enabled": False,
                "passed": True,
                "score": 1.0,
                "rule_results": [],
                "failed_rules": []
            }

        rules = rule_set.get("rules", [])
        pass_strategy = rule_set.get("pass_strategy", "weighted_score")
        pass_threshold = rule_set.get("pass_threshold", 0.6)

        rule_results = []
        failed_rules = []
        total_weight = 0
        passed_weight = 0

        # Evaluate each rule
        for rule in rules:
            result = cls.evaluate_rule(rule, activity, parameters_registry)
            rule_results.append(result)

            if result.get("skipped"):
                continue

            if result.get("error"):
                logger.warning(f"[RULE-ENGINE] Rule error: {result['error']}")
                continue

            weight = result.get("weight", 1)
            total_weight += weight

            if result["passed"]:
                passed_weight += weight
            else:
                failed_rules.append(result)

        # Calculate score
        score = passed_weight / total_weight if total_weight > 0 else 1.0

        # Apply pass strategy
        if pass_strategy == "all_must_pass":
            passed = all(
                r["passed"] for r in rule_results
                if not r.get("skipped") and not r.get("error")
            )
        elif pass_strategy == "weighted_score":
            passed = score >= pass_threshold
        elif pass_strategy == "at_least_n":
            n = rule_set.get("at_least_n", 1)
            passed_count = sum(
                1 for r in rule_results
                if r["passed"] and not r.get("skipped")
            )
            passed = passed_count >= n
        else:
            passed = False
            logger.warning(f"[RULE-ENGINE] Unknown pass strategy: {pass_strategy}")

        return {
            "enabled": True,
            "passed": passed,
            "score": round(score, 2),
            "rule_results": rule_results,
            "failed_rules": failed_rules,
            "pass_strategy": pass_strategy,
            "pass_threshold": pass_threshold
        }

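The operator table plus the weighted_score strategy reduce to a few lines of arithmetic. This standalone sketch mirrors a subset of the lambdas above and runs three hypothetical checks (the parameter values, bounds, and weights are made up for illustration):

```python
from typing import Any, Callable, Dict

# Standalone mirror of an OPERATORS subset: missing values never pass a comparison
OPS: Dict[str, Callable[[Any, Any], bool]] = {
    "gte": lambda actual, expected: actual is not None and actual >= expected,
    "between": lambda actual, expected: actual is not None and expected[0] <= actual <= expected[1],
}

# Hypothetical rules as (actual value, operator, expected, weight)
checks = [
    (45, "gte", 30, 2),               # e.g. duration >= 30 min, weight 2 -> passes
    (128, "between", (110, 140), 1),  # e.g. avg HR within range, weight 1 -> passes
    (None, "gte", 3, 1),              # missing value -> fails, weight 1
]

total_weight = sum(w for _, _, _, w in checks)
passed_weight = sum(w for a, op, e, w in checks if OPS[op](a, e))

score = passed_weight / total_weight
print(score, score >= 0.6)  # weighted_score strategy with a 0.6 threshold
```

With weights 2 + 1 passing out of 4 total, the score is 0.75, which clears the default 0.6 threshold even though one rule failed.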
class IntensityZoneEvaluator:
    """
    Evaluates heart rate zones and time distribution.
    """

    @staticmethod
    def evaluate(
        zone_config: Dict,
        activity: Dict,
        user_profile: Dict
    ) -> Dict:
        """
        Evaluates which HR zone the activity was in.

        Args:
            zone_config: intensity_zones configuration
            activity: Activity data (with hr_avg)
            user_profile: User profile (with hf_max)

        Returns:
            {
                "enabled": bool,
                "dominant_zone": str,
                "avg_hr_percent": float,
                "zone_color": str,
                "zone_effect": str,
                "duration_quality": float (0-1),
                "recommendation": str
            }
        """
        if not zone_config.get("enabled", False):
            return {"enabled": False}

        avg_hr = activity.get("hr_avg")
        user_max_hr = user_profile.get("hf_max", 180)  # Default 180 if not set

        if not avg_hr or not user_max_hr:
            return {
                "enabled": True,
                "dominant_zone": "unknown",
                "avg_hr_percent": None,
                "recommendation": "Heart rate data missing"
            }

        avg_hr_percent = (avg_hr / user_max_hr) * 100

        # Find matching zone
        zones = zone_config.get("zones", [])
        dominant_zone = None

        for zone in zones:
            zone_rules = zone.get("rules", [])
            for rule in zone_rules:
                if rule["parameter"] == "avg_hr_percent":
                    min_percent, max_percent = rule["value"]
                    if min_percent <= avg_hr_percent <= max_percent:
                        dominant_zone = zone
                        break
            if dominant_zone:
                break

        if not dominant_zone:
            return {
                "enabled": True,
                "dominant_zone": "out_of_range",
                "avg_hr_percent": round(avg_hr_percent, 1),
                "recommendation": "Heart rate outside defined zones"
            }

        # Check duration quality
        duration = activity.get("duration_min", 0)
        target_duration = dominant_zone.get("target_duration_min", 30)
        duration_quality = min(duration / target_duration, 1.0) if target_duration > 0 else 1.0

        recommendation = f"Training in zone '{dominant_zone['name']}' (effect: {dominant_zone['effect']})."
        if duration < target_duration:
            recommendation += f" For optimal effect: {target_duration}min recommended."

        return {
            "enabled": True,
            "dominant_zone": dominant_zone.get("id"),
            "dominant_zone_name": dominant_zone.get("name"),
            "avg_hr_percent": round(avg_hr_percent, 1),
            "zone_color": dominant_zone.get("color"),
            "zone_effect": dominant_zone.get("effect"),
            "duration_quality": round(duration_quality, 2),
            "target_duration_min": target_duration,
            "actual_duration_min": duration,
            "recommendation": recommendation
        }

class TrainingEffectsEvaluator:
    """
    Evaluates which abilities are trained by the activity.
    """

    @staticmethod
    def evaluate(
        effects_config: Dict,
        activity: Dict,
        intensity_zone_result: Optional[Dict] = None
    ) -> Dict:
        """
        Evaluates training effects (abilities trained).

        Args:
            effects_config: training_effects configuration
            activity: Activity data
            intensity_zone_result: Result from intensity zone evaluation

        Returns:
            {
                "enabled": bool,
                "abilities_trained": [...],
                "total_training_load": float
            }
        """
        if not effects_config.get("enabled", False):
            return {"enabled": False}

        abilities_trained = []

        # Use default effects if no conditional matching
        default_effects = effects_config.get("default_effects", {})
        primary_abilities = default_effects.get("primary_abilities", [])
        secondary_abilities = default_effects.get("secondary_abilities", [])

        # Calculate quality factor (simplified for now)
        quality_factor = 1.0

        # Primary abilities
        for ability in primary_abilities:
            abilities_trained.append({
                "category": ability["category"],
                "ability": ability["ability"],
                "intensity": ability["intensity"],
                "quality": quality_factor,
                "contribution": ability["intensity"] * quality_factor,
                "type": "primary"
            })

        # Secondary abilities
        for ability in secondary_abilities:
            abilities_trained.append({
                "category": ability["category"],
                "ability": ability["ability"],
                "intensity": ability["intensity"],
                "quality": quality_factor * 0.7,  # Secondary = 70%
                "contribution": ability["intensity"] * quality_factor * 0.7,
                "type": "secondary"
            })

        total_training_load = sum(a["contribution"] for a in abilities_trained)

        return {
            "enabled": True,
            "abilities_trained": abilities_trained,
            "total_training_load": round(total_training_load, 2),
            "metabolic_focus": effects_config.get("metabolic_focus", []),
            "muscle_groups": effects_config.get("muscle_groups", [])
        }

backend/run_migration_024.py (new file, 116 lines)

@@ -0,0 +1,116 @@
#!/usr/bin/env python3
"""
Manual Migration 024 Runner

Run this to manually execute Migration 024 if it didn't run automatically.
"""

import psycopg2
import os
from psycopg2.extras import RealDictCursor

# Database connection
DB_HOST = os.getenv('DB_HOST', 'localhost')
DB_PORT = os.getenv('DB_PORT', '5432')
DB_NAME = os.getenv('DB_NAME', 'bodytrack')
DB_USER = os.getenv('DB_USER', 'bodytrack')
DB_PASS = os.getenv('DB_PASSWORD', '')

def main():
    print("🔧 Manual Migration 024 Runner")
    print("=" * 60)

    # Connect to database
    conn = psycopg2.connect(
        host=DB_HOST,
        port=DB_PORT,
        dbname=DB_NAME,
        user=DB_USER,
        password=DB_PASS
    )
    conn.autocommit = False
    cur = conn.cursor(cursor_factory=RealDictCursor)

    try:
        # Check if table exists
        cur.execute("""
            SELECT EXISTS (
                SELECT FROM information_schema.tables
                WHERE table_name = 'goal_type_definitions'
            )
        """)
        exists = cur.fetchone()['exists']

        if exists:
            print("✓ goal_type_definitions table already exists")

            # Check if it has data
            cur.execute("SELECT COUNT(*) as count FROM goal_type_definitions")
            count = cur.fetchone()['count']
            print(f"✓ Table has {count} entries")

            if count > 0:
                print("\n📊 Existing Goal Types:")
                cur.execute("""
                    SELECT type_key, label_de, unit, is_system, is_active
                    FROM goal_type_definitions
                    ORDER BY is_system DESC, label_de
                """)
                for row in cur.fetchall():
                    status = "SYSTEM" if row['is_system'] else "CUSTOM"
                    active = "ACTIVE" if row['is_active'] else "INACTIVE"
                    print(f"  - {row['type_key']}: {row['label_de']} ({row['unit']}) [{status}] [{active}]")

                print("\n✅ Migration 024 is already complete!")
                return

        # Run migration
        print("\n🚀 Running Migration 024...")

        with open('migrations/024_goal_type_registry.sql', 'r', encoding='utf-8') as f:
            migration_sql = f.read()

        cur.execute(migration_sql)
        conn.commit()

        print("✅ Migration 024 executed successfully!")

        # Verify
        cur.execute("SELECT COUNT(*) as count FROM goal_type_definitions")
        count = cur.fetchone()['count']
        print(f"✓ {count} goal types seeded")

        # Show created types
        cur.execute("""
            SELECT type_key, label_de, unit, is_system
            FROM goal_type_definitions
            WHERE is_active = true
            ORDER BY is_system DESC, label_de
        """)

        print("\n📊 Created Goal Types:")
        for row in cur.fetchall():
            status = "SYSTEM" if row['is_system'] else "CUSTOM"
            print(f"  - {row['type_key']}: {row['label_de']} ({row['unit']}) [{status}]")

        # Update schema_migrations
        cur.execute("""
            INSERT INTO schema_migrations (filename, executed_at)
            VALUES ('024_goal_type_registry.sql', NOW())
            ON CONFLICT (filename) DO NOTHING
        """)
        conn.commit()

        print("\n✅ Migration 024 complete!")

    except Exception as e:
        conn.rollback()
        print(f"\n❌ Error: {e}")
        import traceback
        traceback.print_exc()
    finally:
        cur.close()
        conn.close()

if __name__ == '__main__':
    main()

docs/DOCUMENTATION_COMPLETE_2026-03-27.md (new file, 196 lines)

@@ -0,0 +1,196 @@
# Documentation Complete - March 27, 2026

## ✅ What was documented?

### 1. Main status document
📄 **`docs/STATUS_2026-03-27.md`** (NEW)
- Full current state
- Gitea issues status (open/closed)
- Next steps (testing → release → code splitting → Phase 0b)
- Code metrics and technical debt
- Decision points and risks
- **Re-entry point for future sessions**

### 2. New issue documented
📄 **`docs/issues/issue-52-blood-pressure-dual-targets.md`** (NEW)
- Blood pressure goals need two target fields (systolic/diastolic)
- Migration 033 planned
- UI changes described
- Estimated effort: 2-3h

### 3. CLAUDE.md updated
📄 **`CLAUDE.md`**
- Version: v0.9g+ → v0.9h
- Dynamic Focus Areas v2.0 section added
- Bug fixes documented
- Status: READY FOR RELEASE v0.9h

### 4. Roadmap updated
📄 **`.claude/docs/ROADMAP.md`**
- Phase 0a: ✅ COMPLETE
- Phase 0b: 🎯 NEXT (detailed plan)
- Timeline updated
- Phase overview restructured

---

## 📋 Gitea Issues - Current State

### Reviewed ✅
- All open issues reviewed (49, 47, 46, 45, 43, 42, 40, 39, 38, 37, 36, 35, 34, 33, 32, 30, 29, 27, 26, 25)
- Closed issues verified (#50, #51, #48, #44, #28)

### Manual actions required ⚠️

Still to be done in Gitea (http://192.168.2.144:3000/Lars/mitai-jinkendo/issues):

1. **Close issue #25:**
   - Title: "[FEAT] Ziele-System (Goals) - v9e Kernfeature"
   - Status: ✅ COMPLETE (Phase 0a + Dynamic Focus Areas v2.0)
   - Action: set to "Closed" manually
   - Comment: "Completed in v0.9g-h: Phase 0a + Dynamic Focus Areas v2.0. See issue #50 and #51 for details."

2. **Create issue #52:**
   - Title: "Enhancement: Blutdruck-Ziele benötigen zwei Zielfelder (systolisch/diastolisch)"
   - Labels: enhancement, goals, blood-pressure
   - Priority: Medium
   - Description: copy from `docs/issues/issue-52-blood-pressure-dual-targets.md`
   - Effort: 2-3h
   - Milestone: v0.10a (after Phase 0b)

---

## 🎯 Next Steps (when you continue)

### Immediately (after the deployment test):
1. **Test the vitals baseline fix**
   - Enter a resting heart rate (should work now)
   - Test other baseline values

2. **Start goals testing**
   - See the checklist in `STATUS_2026-03-27.md`
   - 2-3 days of thorough testing

### Then:
3. **Prepare release v0.9h**
   - Write release notes
   - Merge develop → main
   - Tag v0.9h
   - Deploy to production

4. **Perform code splitting**
   - goals.py → 5 separate routers
   - Optional: review insights.py

5. **Start Phase 0b**
   - 120+ goal-aware placeholders
   - Score system
   - 16-20h effort

---

## 📚 Key Documents - Reading Order

When you return to this point:

### 1. Read first:
- **`docs/STATUS_2026-03-27.md`** ← START HERE
- **`CLAUDE.md`** (current version)
- **`docs/NEXT_STEPS_2026-03-26.md`** (Phase 0b details)

### 2. As needed:
- **`.claude/docs/ROADMAP.md`** (overall picture)
- **`docs/issues/issue-50-phase-0a-goal-system.md`** (what was built)
- **`docs/issues/issue-52-blood-pressure-dual-targets.md`** (next enhancement)

### 3. Functional specs:
- **`.claude/docs/functional/AI_PROMPTS.md`** (prompt system)
- **`.claude/docs/functional/TRAINING_TYPES.md`** (training types + abilities)

### 4. Technical specs:
- **`.claude/docs/technical/MEMBERSHIP_SYSTEM.md`** (feature enforcement)
- **`.claude/docs/architecture/`** (if present)

---

## 🔄 Re-Entry Point for Claude Code

### Context prompt (copy-paste for a new session):
```
We are at v0.9g/h release preparation.

CURRENT STATE:
- Phase 0a (goals system) + Dynamic Focus Areas v2.0: ✅ COMPLETE
- Vitals baseline fix: deployed (needs testing)
- Branch: develop (6 commits ahead of main)
- Status: READY FOR RELEASE v0.9h

NEXT STEP:
- Testing (goals + vitals)
- Then: release v0.9h → code splitting → Phase 0b

READ FIRST:
- docs/STATUS_2026-03-27.md (full state)
- CLAUDE.md (current version)

ASK ME:
"What is the current step?" → Then I will tell you: testing/release/splitting/Phase 0b
```

---

## 📊 Summary - What is done?

### ✅ Fully implemented
- Goals system (Phase 0a)
  - Strategic layer (goal_mode, goals CRUD)
  - Tactical layer (CustomGoalsPage)
  - Training phases framework (tables, backend)
  - Fitness tests framework (tables, backend)
- Dynamic Focus Areas v2.0
  - 26 base areas in 7 categories
  - User-extensible (admin CRUD UI)
  - Many-to-many goals ↔ focus areas
  - User preferences with weightings
- Bug fixes
  - Focus contributions persisted
  - Filtering (weighted areas only)
  - Vitals baseline endpoint

### 🔲 Still to do (documented)
- Code splitting (goals.py → 5 routers)
- Phase 0b (120+ placeholders, score system)
- Issue #52 (BP dual targets)
- Responsive UI (issue #30)
- Further features (see roadmap)

---

## 🎉 Documentation Quality

**Completeness:** ⭐⭐⭐⭐⭐
- All key documents updated
- New documents created
- Gitea issues reviewed
- Re-entry point clearly defined

**Traceability:** ⭐⭐⭐⭐⭐
- Status document with full details
- Decisions documented
- Next steps clearly described

**Maintainability:** ⭐⭐⭐⭐⭐
- Structured documentation
- Clear cross-references between documents
- Reading order defined

---

**Created:** March 27, 2026, 23:00
**By:** Claude Code (Sonnet 4.5)
**Commit:** eb5c099 (docs: comprehensive status update v0.9h pre-release)

**You can now:**
✅ Pause safely
✅ Test the deployment
✅ Resume at exactly this point at any time
docs/GOALS_SYSTEM_UNIFIED_ANALYSIS.md (new file, 595 lines)

@@ -0,0 +1,595 @@
# Goals System: Unified Analysis of Both Functional Concepts

**Date:** March 26, 2026
**Based on:**
- `.claude/docs/functional/GOALS_VITALS.md` (v9e spec)
- `.claude/docs/functional/mitai_jinkendo_konzept_diagramme_auswertungen_v2.md`

---

## 1. Key insight: BOTH concepts are complementary!

### GOALS_VITALS.md defines:
- **Concrete target values** (e.g. "82 kg by 2026-06-30")
- 8 goal types (weight, body fat %, VO2max, etc.)
- Primary/secondary goal concept
- Training phases (automatic detection)
- Active tests (Cooper, push-ups, etc.)
- 13 new AI placeholders

### Concept v2 defines:
- **Goal modes** (strategic direction: weight_loss, strength, etc.)
- Score weighting per goal mode
- Chart prioritization per goal mode
- Rule-based interpretations

### How they fit together:
```
Goal MODE (v2)       → "weight_loss" (strategic direction)
        ↓
Primary GOAL (v9e)   → "82 kg by 2026-06-30" (concrete target)
Secondary GOAL       → "16% body fat"
        ↓
Training PHASE (v9e) → "calorie_deficit" (detected automatically)
        ↓
Score weights (v2)   → body_progress: 0.30, nutrition: 0.25, ...
        ↓
Charts (v2)          → show weighted scores + progress toward targets
```

---

## 2. Two-Level Architecture

### Level 1: STRATEGIC (goal modes from v2)
**What:** Fundamental training direction
**Values:** weight_loss, strength, endurance, recomposition, health
**Purpose:** Determines score weighting and interpretation context
**Example:** "I want to build strength" → mode: strength

### Level 2: TACTICAL (goal targets from v9e)
**What:** Concrete, measurable targets
**Values:** "82 kg by 2026-06-30", "VO2max 55 ml/kg/min", "50 push-ups"
**Purpose:** Progress tracking, forecasts, motivation
**Example:** "I want to weigh 82 kg" → target: weight goal

### Together they form the complete goals system

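The two-level structure above can be sketched in a few lines of Python. This is a minimal illustration only, using hypothetical class names (`Profile`, `GoalTarget`); the actual implementation stores the strategic level in `profiles.goal_mode` and the tactical level in the `goals` table.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class GoalTarget:
    """Tactical level (v9e): one concrete, measurable target.
    Hypothetical class for illustration only."""
    goal_type: str          # e.g. "weight"
    target_value: float
    start_value: float
    current_value: float
    target_date: date

    @property
    def progress_pct(self) -> float:
        # Same formula as the goals table: (current - start) / (target - start) * 100
        span = self.target_value - self.start_value
        return round((self.current_value - self.start_value) / span * 100, 1)

@dataclass
class Profile:
    """Strategic level (v2): one goal mode plus concrete targets."""
    goal_mode: str          # e.g. "weight_loss"
    primary_goal: GoalTarget

profile = Profile(
    goal_mode="weight_loss",
    primary_goal=GoalTarget("weight", 82.0, 86.1, 85.2, date(2026, 6, 30)),
)
print(profile.goal_mode, profile.primary_goal.progress_pct)
```

The point of the split: the mode stays stable for months and steers scoring and interpretation, while individual targets come and go underneath it.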
---

## 3. Revised Data Model

### Table: `profiles` (extend)
```sql
-- Strategic goal mode (from v2)
ALTER TABLE profiles ADD COLUMN goal_mode VARCHAR(50) DEFAULT 'health';

COMMENT ON COLUMN profiles.goal_mode IS
'Strategic goal mode: weight_loss, strength, endurance, recomposition, health.
Determines score weights and interpretation context.';
```

### Table: `goals` (NEW, from v9e)
```sql
CREATE TABLE goals (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    profile_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,

    -- Goal classification
    goal_type VARCHAR(50) NOT NULL,  -- weight, body_fat, lean_mass, vo2max, strength, flexibility, bp, rhr
    is_primary BOOLEAN DEFAULT false,
    status VARCHAR(20) DEFAULT 'active',  -- draft, active, reached, abandoned, expired

    -- Target values
    target_value DECIMAL(10,2),
    current_value DECIMAL(10,2),
    start_value DECIMAL(10,2),
    unit VARCHAR(20),  -- kg, %, ml/kg/min, bpm, mmHg, cm, reps

    -- Timeline
    start_date DATE DEFAULT CURRENT_DATE,
    target_date DATE,
    reached_date DATE,

    -- Metadata
    name VARCHAR(100),  -- e.g. "Sommerfigur 2026"
    description TEXT,

    -- Progress tracking
    progress_pct DECIMAL(5,2),  -- Auto-calculated: (current - start) / (target - start) * 100

    -- Timestamps
    created_at TIMESTAMP DEFAULT NOW(),
    updated_at TIMESTAMP DEFAULT NOW(),

    -- Constraints
    CHECK (progress_pct >= 0 AND progress_pct <= 100),
    CHECK (status IN ('draft', 'active', 'reached', 'abandoned', 'expired'))
);

-- Only one primary goal per profile
CREATE UNIQUE INDEX idx_goals_primary ON goals(profile_id, is_primary) WHERE is_primary = true;

-- Index for active goals lookup
CREATE INDEX idx_goals_active ON goals(profile_id, status) WHERE status = 'active';
```

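One subtlety of the `goals` table: the CHECK constraint requires `progress_pct` to stay within 0-100, but the raw formula `(current - start) / (target - start) * 100` falls outside that range whenever the metric regresses past the start value or overshoots the target. A minimal sketch of a clamped calculation, assuming the backend computes this value before writing the row (the function name is illustrative, not the actual code):

```python
def calc_progress_pct(start: float, current: float, target: float) -> float:
    """Progress toward a goal, clamped to 0-100 so the stored value
    satisfies CHECK (progress_pct >= 0 AND progress_pct <= 100)."""
    span = target - start
    if span == 0:
        return 100.0  # target equals start: nothing left to do
    raw = (current - start) / span * 100
    return round(min(max(raw, 0.0), 100.0), 2)

# Weight-loss goal: start 86.1 kg, target 82.0 kg
print(calc_progress_pct(86.1, 85.2, 82.0))  # partway there
print(calc_progress_pct(86.1, 86.8, 82.0))  # regressed past start -> clamped to 0.0
print(calc_progress_pct(86.1, 81.5, 82.0))  # overshot target -> clamped to 100.0
```

Note that the formula works for both directions automatically: for a weight-loss goal the span is negative and a falling `current` still yields rising progress.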
### Table: `training_phases` (NEW, from v9e)
```sql
CREATE TABLE training_phases (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    profile_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,

    -- Phase type
    phase_type VARCHAR(50) NOT NULL,
    -- Values: calorie_deficit, calorie_maintenance, calorie_surplus,
    --         conditioning, hiit, max_strength, regeneration, competition_prep

    -- Detection
    detected_automatically BOOLEAN DEFAULT false,
    confidence_score DECIMAL(3,2),  -- 0.00-1.00

    -- Status
    status VARCHAR(20) DEFAULT 'suggested',  -- suggested, confirmed, active, ended

    -- Timeline
    start_date DATE,
    end_date DATE,

    -- Metadata
    detection_reason TEXT,  -- Why was this phase detected?
    user_notes TEXT,

    -- Timestamps
    created_at TIMESTAMP DEFAULT NOW(),
    updated_at TIMESTAMP DEFAULT NOW()
);

-- Only one active phase per profile
CREATE UNIQUE INDEX idx_phases_active ON training_phases(profile_id, status) WHERE status = 'active';
```

### Table: `fitness_tests` (NEW, from v9e)
```sql
CREATE TABLE fitness_tests (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    profile_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,

    -- Test type
    test_type VARCHAR(50) NOT NULL,
    -- Standard: cooper, step_test, pushups, squats, sit_reach, balance, grip_strength
    -- Custom: user_defined

    -- Result
    result_value DECIMAL(10,2) NOT NULL,
    result_unit VARCHAR(20) NOT NULL,  -- meters, bpm, reps, cm, seconds, kg

    -- Test date
    test_date DATE NOT NULL,

    -- Evaluation
    norm_category VARCHAR(30),  -- very_good, good, average, needs_improvement
    percentile DECIMAL(5,2),    -- Where user ranks vs. norm (0-100)

    -- Trend
    improvement_vs_last DECIMAL(10,2),  -- % change from previous test

    -- Metadata
    notes TEXT,
    conditions TEXT,  -- e.g., "Nach 3h Schlaf, erkältet"

    -- Next test recommendation
    recommended_retest_date DATE,

    created_at TIMESTAMP DEFAULT NOW()
);

CREATE INDEX idx_fitness_tests_profile_type ON fitness_tests(profile_id, test_type, test_date DESC);
```

---

## 4. Unified API Structure

### Goal modes (strategic)
```python
# routers/goals.py

@router.get("/modes")
def get_goal_modes():
    """Get all strategic goal modes with score weights."""
    return GOAL_MODES  # From v2 concept

@router.post("/set-mode")
def set_goal_mode(goal_mode: str, session=Depends(require_auth)):
    """Set user's strategic goal mode."""
    # Updates profiles.goal_mode
```

### Goal targets (tactical)
```python
@router.get("/targets")
def get_goal_targets(session=Depends(require_auth)):
    """Get all active goal targets."""
    profile_id = session['profile_id']
    # Returns list from goals table
    # Includes: primary + all secondary goals

@router.post("/targets")
def create_goal_target(goal: GoalCreate, session=Depends(require_auth)):
    """Create a new goal target."""
    # Inserts into goals table
    # Auto-calculates progress_pct

@router.get("/targets/{goal_id}")
def get_goal_detail(goal_id: str, session=Depends(require_auth)):
    """Get detailed goal info with history."""
    # Returns goal + progress history + prognosis

@router.put("/targets/{goal_id}/progress")
def update_goal_progress(goal_id: str, session=Depends(require_auth)):
    """Recalculate goal progress."""
    # Auto-called after new measurements
    # Updates current_value, progress_pct

@router.post("/targets/{goal_id}/reach")
def mark_goal_reached(goal_id: str, session=Depends(require_auth)):
    """Mark goal as reached."""
    # Sets status='reached', reached_date=today
```

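The `create_goal_target` sketch above takes a `GoalCreate` request body that is not defined anywhere in this document. A minimal Pydantic model mirroring the `goals` table columns might look like this; the field set and defaults are assumptions for illustration, not the actual schema:

```python
from datetime import date
from typing import Optional
from pydantic import BaseModel

class GoalCreate(BaseModel):
    """Hypothetical request body for POST /targets; field names
    follow the goals table, but the real model may differ."""
    goal_type: str                       # weight, body_fat, vo2max, ...
    target_value: float
    start_value: Optional[float] = None  # could default to the latest measurement
    unit: str = "kg"
    target_date: Optional[date] = None
    is_primary: bool = False
    name: Optional[str] = None           # e.g. "Sommerfigur 2026"

payload = GoalCreate(goal_type="weight", target_value=82.0,
                     start_value=86.1, target_date=date(2026, 6, 30))
print(payload.goal_type, payload.target_value)
```

With FastAPI, declaring the parameter as `goal: GoalCreate` is enough to get JSON parsing and validation of the request body.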
### Training phases
```python
@router.get("/phases/current")
def get_current_phase(session=Depends(require_auth)):
    """Get active training phase."""

@router.get("/phases/detect")
def detect_phase(session=Depends(require_auth)):
    """Run phase detection algorithm."""
    # Analyzes last 14 days
    # Returns suggested phase + confidence + reasoning

@router.post("/phases/confirm")
def confirm_phase(phase_id: str, session=Depends(require_auth)):
    """Confirm detected phase."""
    # Sets status='active'
```

### Fitness tests
```python
@router.get("/tests/types")
def get_test_types():
    """Get all available fitness tests."""

@router.post("/tests/{test_type}/execute")
def record_test_result(
    test_type: str,
    result_value: float,
    result_unit: str,
    session=Depends(require_auth)
):
    """Record a fitness test result."""
    # Inserts into fitness_tests
    # Auto-calculates norm_category, percentile, improvement

@router.get("/tests/due")
def get_due_tests(session=Depends(require_auth)):
    """Get tests that are due for retesting."""
```

---

## 5. New AI Placeholders (combined from both concepts)

The example values below are left in German on purpose: they are the literal strings the (German-language) app would substitute into prompts.

### Strategic (from v2)
```python
{{goal_mode}}              # "weight_loss"
{{goal_mode_label}}        # "Gewichtsreduktion"
{{goal_mode_description}}  # "Fettabbau bei Erhalt der Magermasse"
```

### Tactical - primary goal (from v9e)
```python
{{primary_goal_type}}            # "weight"
{{primary_goal_name}}            # "Sommerfigur 2026"
{{primary_goal_target}}          # "82 kg bis 30.06.2026"
{{primary_goal_current}}         # "85.2 kg"
{{primary_goal_start}}           # "86.1 kg"
{{primary_goal_progress_pct}}    # "72%"
{{primary_goal_progress_text}}   # "72% erreicht (4 kg von 5,5 kg)"
{{primary_goal_days_remaining}}  # "45 Tage"
{{primary_goal_prognosis}}       # "Ziel voraussichtlich in 6 Wochen erreicht (3 Wochen früher!)"
{{primary_goal_on_track}}        # "true"
```

### Tactical - secondary goals (from v9e)
```python
{{secondary_goals_count}}      # "2"
{{secondary_goals_list}}       # "16% Körperfett, VO2Max 55 ml/kg/min"
{{secondary_goal_1_type}}      # "body_fat"
{{secondary_goal_1_progress}}  # "45%"
```

### Training phase (from v9e)
```python
{{current_phase}}                 # "calorie_deficit"
{{current_phase_label}}           # "Kaloriendefizit"
{{phase_since}}                   # "seit 14 Tagen"
{{phase_confidence}}              # "0.92"
{{phase_recommendation}}          # "Krafttraining erhalten, Cardio moderat, Proteinzufuhr 2g/kg"
{{phase_detected_automatically}}  # "true"
```

### Fitness tests (from v9e)
```python
{{test_last_cooper}}           # "2.800m (VO2Max ~52) vor 3 Wochen"
{{test_last_cooper_date}}      # "2026-03-05"
{{test_last_cooper_result}}    # "2800"
{{test_last_cooper_vo2max}}    # "52.3"
{{test_last_cooper_category}}  # "good"
{{test_due_list}}              # "Sit & Reach (seit 5 Wochen), Liegestütze (seit 4 Wochen)"
{{test_next_recommended}}      # "Cooper-Test (in 2 Wochen fällig)"
{{fitness_score_overall}}      # "72/100"
{{fitness_score_endurance}}    # "good"
{{fitness_score_strength}}     # "average"
{{fitness_score_flexibility}}  # "needs_improvement"
```

### TOTAL: 35+ new placeholders from v9e
Plus the 84 from v2 = **120+ new placeholders in total**

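How `{{...}}` tokens end up in a rendered prompt is not spelled out in this document; a minimal regex-based substitution sketch, assuming a simple flat key-to-string mapping (the app's actual prompt renderer may work differently):

```python
import re

def render_prompt(template: str, values: dict) -> str:
    """Replace {{placeholder}} tokens with their values.
    Unknown placeholders are left intact so missing data stays visible."""
    def sub(match: re.Match) -> str:
        key = match.group(1)
        return str(values[key]) if key in values else match.group(0)
    return re.sub(r"\{\{(\w+)\}\}", sub, template)

template = "Mode: {{goal_mode}}, Fortschritt: {{primary_goal_progress_pct}}"
print(render_prompt(template, {
    "goal_mode": "weight_loss",
    "primary_goal_progress_pct": "72%",
}))
# Mode: weight_loss, Fortschritt: 72%
```

Leaving unknown tokens untouched (rather than substituting an empty string) makes gaps in the data easy to spot during prompt debugging.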
---
|
||||||
|
|
||||||
|
## 6. Überarbeitete Implementierungs-Roadmap
|
||||||
|
|
||||||
|
### Phase 0a: Minimal Goal System (3-4h) ⭐ **JETZT**
|
||||||
|
|
||||||
|
**Strategic Layer:**
|
||||||
|
- DB: `goal_mode` in profiles
|
||||||
|
- Backend: GOAL_MODES aus v2
|
||||||
|
- API: GET/SET goal mode
|
||||||
|
- UI: Goal Mode Selector (5 Modi)
|
||||||
|
|
||||||
|
**Tactical Layer:**
|
||||||
|
- DB: `goals` table
|
||||||
|
- API: CRUD für goal targets
|
||||||
|
- UI: Goal Management Page (minimal)
|
||||||
|
- Liste aktiver Ziele
|
||||||
|
- Fortschrittsbalken
|
||||||
|
- "+ Neues Ziel" Button
|
||||||
|
|
||||||
|
**Aufwand:** 3-4h (erweitert wegen Tactical Layer)
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### Phase 0b: Goal-Aware Placeholders (16-20h)
|
||||||
|
|
||||||
|
**Strategic Placeholders:**
|
||||||
|
```python
|
||||||
|
{{goal_mode}} # Aus profiles.goal_mode
|
||||||
|
{{goal_mode_label}} # Aus GOAL_MODES mapping
|
||||||
|
```
|
||||||
|
|
||||||
|
**Tactical Placeholders:**
|
||||||
|
```python
|
||||||
|
{{primary_goal_type}} # Aus goals WHERE is_primary=true
|
||||||
|
{{primary_goal_target}}
|
||||||
|
{{primary_goal_progress_pct}}
|
||||||
|
{{primary_goal_prognosis}} # Berechnet aus Trend
|
||||||
|
```
|
||||||
|
|
||||||
|
**Score Calculations (goal-aware):**
|
||||||
|
```python
|
||||||
|
def get_body_progress_score(profile_id: str) -> str:
|
||||||
|
profile = get_profile_data(profile_id)
|
||||||
|
goal_mode = profile.get('goal_mode', 'health')
|
||||||
|
|
||||||
|
# Get weights from v2 concept
|
||||||
|
weights = GOAL_MODES[goal_mode]['score_weights']
|
||||||
|
|
||||||
|
# Calculate sub-scores
|
||||||
|
fm_score = calculate_fm_progress(profile_id)
|
||||||
|
lbm_score = calculate_lbm_progress(profile_id)
|
||||||
|
|
||||||
|
# Weight according to goal mode
|
||||||
|
if goal_mode == 'weight_loss':
|
||||||
|
total = 0.50 * fm_score + 0.30 * weight_score + 0.20 * lbm_score
|
||||||
|
elif goal_mode == 'strength':
|
||||||
|
total = 0.60 * lbm_score + 0.30 * fm_score + 0.10 * weight_score
|
||||||
|
# ...
|
||||||
|
|
||||||
|
return f"{int(total)}/100"
|
||||||
|
```
|
||||||
|
|
||||||
|
---

### Phase 0c: Training Phases (4-6h) **PARALLEL**

**DB:**
- `training_phases` table

**Detection Algorithm:**
```python
def detect_current_phase(profile_id: str) -> dict:
    """Detects training phase from last 14 days of data."""

    # Analyze data
    kcal_balance = get_kcal_balance_14d(profile_id)
    training_dist = get_training_distribution_14d(profile_id)
    weight_trend = get_weight_trend_14d(profile_id)
    hrv_avg = get_hrv_avg_14d(profile_id)
    hrv_baseline = get_hrv_baseline(profile_id)
    vo2max_trend = get_vo2max_trend_14d(profile_id)
    volume_change = get_volume_change_14d(profile_id)

    # Phase detection rules
    if kcal_balance < -300 and weight_trend < 0:
        return {
            'phase': 'calorie_deficit',
            'confidence': 0.85,
            'reason': f'Avg kcal balance {kcal_balance}/day, weight -0.5kg/week'
        }

    if training_dist['endurance'] > 60 and vo2max_trend > 0:
        return {
            'phase': 'conditioning',
            'confidence': 0.78,
            'reason': f'{training_dist["endurance"]}% cardio, VO2max improving'
        }

    if volume_change < -40 and hrv_avg < hrv_baseline * 0.85:
        return {
            'phase': 'regeneration',
            'confidence': 0.92,
            'reason': 'Volume -40%, HRV below baseline, recovery needed'
        }

    # Default
    return {
        'phase': 'maintenance',
        'confidence': 0.50,
        'reason': 'No clear pattern detected'
    }
```

**API:**
- GET /phases/current
- GET /phases/detect
- POST /phases/confirm

**UI:**
- Dashboard badge: "📊 Phase: Calorie deficit"
- Phase detection banner: "We detected a calorie-deficit phase. Is that correct?"

---

### Phase 0d: Fitness Tests (4-6h) **LATER**

**DB:**
- `fitness_tests` table

**Test Definitions:**
```python
FITNESS_TESTS = {
    'cooper': {
        'name': 'Cooper-Test',
        'description': '12 Minuten laufen, maximale Distanz',
        'unit': 'meters',
        'interval_weeks': 6,
        'norm_tables': {  # Simplified
            'male_30-39': {'very_good': 2800, 'good': 2500, 'average': 2200},
            'female_30-39': {'very_good': 2500, 'good': 2200, 'average': 1900}
        },
        'calculate_vo2max': lambda distance: (distance - 504.9) / 44.73
    },
    'pushups': {
        'name': 'Liegestütze-Test',
        'description': 'Maximale Anzahl ohne Pause',
        'unit': 'reps',
        'interval_weeks': 4,
        'norm_tables': { ... }
    },
    # ... more tests
}
```

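The norm tables and VO2max formula above can be exercised with a small grading helper. This is an illustrative sketch, not part of the concept: `classify_cooper` and `cooper_vo2max` are hypothetical names, and the thresholds are copied from the simplified example above.

```python
# Hypothetical helper: grade a Cooper test result against the (simplified)
# norm tables from FITNESS_TESTS, inlined here so the snippet runs alone.
COOPER_NORMS = {
    'male_30-39': {'very_good': 2800, 'good': 2500, 'average': 2200},
    'female_30-39': {'very_good': 2500, 'good': 2200, 'average': 1900},
}

def classify_cooper(distance_m: float, norm_key: str) -> str:
    """Return the best rating whose threshold the distance reaches."""
    norms = COOPER_NORMS[norm_key]
    for rating in ('very_good', 'good', 'average'):  # ordered best to worst
        if distance_m >= norms[rating]:
            return rating
    return 'below_average'

def cooper_vo2max(distance_m: float) -> float:
    """Cooper estimate, same formula as the lambda above."""
    return (distance_m - 504.9) / 44.73

print(classify_cooper(2600, 'male_30-39'))  # good
print(round(cooper_vo2max(2600), 1))
```

The rating loop walks from the best category down, so a result that clears several thresholds gets the highest one.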
**UI:**
- Tests page with test list
- Test execution flow (instructions → input → evaluation)
- Test history with trend chart

---

## 7. Prioritized Order

### NOW (3-4h)
**Phase 0a:** Minimal Goal System (Strategic + Tactical)
- Foundation for everything else
- Users can set goals
- Score calculations can use goal_mode

### THIS WEEK (16-20h)
**Phase 0b:** Goal-Aware Placeholders
- 84 placeholders from v2
- 35+ placeholders from v9e
- **TOTAL: 120+ placeholders**

### PARALLEL (4-6h)
**Phase 0c:** Training Phases
- Automatic detection
- Phase-aware recommendations

### LATER (4-6h)
**Phase 0d:** Fitness Tests
- Enhancement, not critical for charts

---

## 8. Key Findings

### 1. GOALS_VITALS.md is more detailed
- Concrete implementation specs
- DB schema proposals
- 13 defined AI placeholders
- **BUT:** lacks score weighting (which v2 has)

### 2. Concept v2 is more strategic
- Goal modes with score weighting
- Chart interpretations
- Rule-based logic
- **BUT:** lacks concrete goal tracking (which v9e has)

### 3. Both together = complete
- v2 (goal modes) + v9e (goal targets) = complete goal system
- v2 (scores) + v9e (tests) = complete assessment
- v2 (charts) + v9e (phases) = context-aware visualization

### 4. My original analysis was incomplete
- I had only looked at v2
- v9e adds critical details
- **New overall estimate:** 120+ placeholders (instead of 84)

---

## 9. Updated Recommendation

**YES to Phase 0a (Minimal Goal System), BUT extended:**

### What Phase 0a must include (3-4h):

1. **Strategic layer (from v2):**
   - goal_mode in profiles
   - GOAL_MODES definition
   - GET/SET endpoints

2. **Tactical layer (from v9e):**
   - goals table
   - CRUD for goals
   - Progress calculation

3. **UI:**
   - Goal mode selector (Settings)
   - Goal management page (basic)
   - Dashboard goal widget

### What can wait:
- Training phases → Phase 0c (parallel)
- Fitness tests → Phase 0d (later)
- Full test integration → v9f

---

## 10. Next Steps

**NOW:**
1. Implement Phase 0a (3-4h)
   - Strategic + tactical goal system
2. Then Phase 0b (goal-aware placeholders, 16-20h)
3. In parallel, Phase 0c (training phases, 4-6h)

**Should I start with Phase 0a (extended)?**
- Both goal concepts integrated
- Ready for 120+ placeholders
- Foundation for an intelligent coach system

**Commit:** ae93b9d (needs to be updated)
**New analysis:** GOALS_SYSTEM_UNIFIED_ANALYSIS.md

---

**New file:** docs/GOAL_SYSTEM_PRIORITY_ANALYSIS.md (538 lines)

# Goal System: Priority Analysis

**Date:** March 26, 2026
**Question:** Goal system before or after placeholders/charts?
**Answer:** **Minimal goal system BEFORE placeholders, full system in parallel**

---

## 1. Key Insight from the Functional Concept

### Quote from the functional concept (lines 20-28):
> **It is important that the system interprets goal-dependently:**
> - Weight reduction
> - Muscle/strength building
> - Conditioning/endurance building
> - Body recomposition
> - General health
>
> **The same raw signal can be rated differently depending on the goal.**
> A calorie deficit, for example, is often positive for weight reduction,
> but potentially a hindrance for strength building.

### Consequence
❌ **Charts WITHOUT a goal system = wrong interpretations**
✅ **Charts WITH a goal system = correct, goal-specific statements**

---

## 2. Dependency Matrix

### What depends on the goal system?

| Component | Goal-dependent? | Example |
|-----------|-----------------|---------|
| **Raw-data charts** | ❌ No | Weight trend, circumference trend |
| **Score weighting** | ✅ YES | Body Progress Score: 30% for weight_loss, 20% for strength |
| **Interpretations** | ✅ YES | Calorie deficit: "good" for weight_loss, "critical" for strength |
| **Hints** | ✅ YES | "Weight stagnating" → warning for weight_loss, irrelevant for strength |
| **Placeholders (calculations)** | ⚠️ PARTIALLY | Trends: no; scores: YES |
| **AI prompts** | ✅ YES | Analysis context changes completely |

### Functional concept: score weighting (lines 185-216)

```yaml
score_weights:
  weight_loss:
    body_progress: 0.30   # body matters most
    nutrition: 0.25
    activity: 0.20
    recovery: 0.15
    health_risk: 0.10

  strength:
    body_progress: 0.20
    nutrition: 0.25
    activity: 0.30        # training matters more
    recovery: 0.20
    health_risk: 0.05     # less critical

  endurance:
    body_progress: 0.10   # body matters less
    activity: 0.35        # training matters most
    recovery: 0.25        # recovery very important
```

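The weighted total implied by these tables is a plain dot product of sub-scores and weights. A minimal sketch, with the weight tables copied from the YAML above and purely illustrative sub-score values:

```python
# Sketch: combine sub-scores (0-100 each) with the goal-specific weights
# from the YAML above. The sub-score values here are illustrative only.
SCORE_WEIGHTS = {
    'weight_loss': {'body_progress': 0.30, 'nutrition': 0.25, 'activity': 0.20,
                    'recovery': 0.15, 'health_risk': 0.10},
    'strength':    {'body_progress': 0.20, 'nutrition': 0.25, 'activity': 0.30,
                    'recovery': 0.20, 'health_risk': 0.05},
}

def total_score(sub_scores: dict, goal_mode: str) -> float:
    """Weighted sum of sub-scores for the given goal mode."""
    weights = SCORE_WEIGHTS[goal_mode]
    return sum(weights[key] * sub_scores[key] for key in weights)

subs = {'body_progress': 80, 'nutrition': 60, 'activity': 40,
        'recovery': 70, 'health_risk': 90}
print(round(total_score(subs, 'weight_loss'), 1))  # 66.5
print(round(total_score(subs, 'strength'), 1))     # 61.5
```

The same sub-scores produce different totals per mode, which is exactly the goal-dependence the quote above demands.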
### Example: Body Progress Score

**WITHOUT goal system:**
```python
def calculate_body_progress_score():
    # Generic, not a good fit for anyone
    fm_delta_score = calculate_fm_change()    # -5kg
    lbm_delta_score = calculate_lbm_change()  # -2kg
    return (fm_delta_score + lbm_delta_score) / 2
    # Score: 50/100 (FM nicely down, but LBM down too)
```

**WITH goal system:**
```python
def calculate_body_progress_score(goal_mode):
    fm_delta_score = calculate_fm_change()    # -5kg
    lbm_delta_score = calculate_lbm_change()  # -2kg

    if goal_mode == "weight_loss":
        # FM down: very good; LBM down: tolerable if not too much
        return 0.70 * fm_delta_score + 0.30 * lbm_delta_score
        # Score: 78/100 (FM matters more, LBM loss less critical)

    elif goal_mode == "strength":
        # FM down: ok; LBM down: VERY BAD
        return 0.30 * fm_delta_score + 0.70 * lbm_delta_score
        # Score: 32/100 (LBM loss is the main problem!)

    elif goal_mode == "recomposition":
        # FM down: good; LBM down: bad
        return 0.50 * fm_delta_score + 0.50 * lbm_delta_score
        # Score: 50/100 (balanced rating)
```

**Result:**
- Same data (-5 kg FM, -2 kg LBM)
- BUT: 78/100 for weight_loss, 32/100 for strength
- **Without a goal: a completely wrong rating!**

---

## 3. Goal Detection from Data

### The functional concept does NOT mention this explicitly, but it follows logically:

**Pattern detection:**
```python
def suggest_goal_from_data(profile_id):
    """Suggests a goal based on data patterns."""

    # Analyze the last 28 days
    training_types = get_training_distribution_28d(profile_id)
    nutrition = get_nutrition_pattern_28d(profile_id)
    body_changes = get_body_changes_28d(profile_id)

    # Pattern 1: lots of strength work + high protein + LBM rising
    if (training_types['strength'] > 60 and
            nutrition['protein_g_per_kg'] > 1.8 and
            body_changes['lbm_trend'] > 0):
        return {
            'suggested_goal': 'strength',
            'confidence': 'high',
            'reasoning': 'Strength training dominant + high protein intake + visible muscle gain'
        }

    # Pattern 2: lots of cardio + calorie deficit + weight falling
    if (training_types['endurance'] > 50 and
            nutrition['kcal_balance_avg'] < -300 and
            body_changes['weight_trend'] < 0):
        return {
            'suggested_goal': 'weight_loss',
            'confidence': 'high',
            'reasoning': 'Endurance training + calorie deficit + weight loss'
        }

    # Pattern 3: mixed training + high protein + stable weight + recomposition
    if (training_types['mixed'] and
            nutrition['protein_g_per_kg'] > 1.6 and
            abs(body_changes['weight_trend']) < 0.05 and
            body_changes['fm_trend'] < 0 and
            body_changes['lbm_trend'] > 0):
        return {
            'suggested_goal': 'recomposition',
            'confidence': 'medium',
            'reasoning': 'Mixed training + recomposition visible (FM↓, LBM↑)'
        }

    # Default: no clear pattern detected
    return {
        'suggested_goal': 'health',
        'confidence': 'low',
        'reasoning': 'No clear pattern detected, assuming health-oriented training'
    }
```

### Prerequisites for goal detection:
1. ✅ At least 21-28 days of data
2. ✅ Training-type distribution
3. ✅ Nutrition pattern
4. ✅ Body trends (FM, LBM, weight)
5. ✅ Computed values → **needs placeholders!**

**BUT:** goal detection is **downstream**, not a prerequisite.

---

## 4. Recommended Implementation Strategy

### Hybrid approach: minimal goals NOW, full system in parallel

## Phase 0a: Minimal Goal System (2-3h) ⭐ **START HERE**

### Objective
The user can set a goal manually; the system uses it in calculations.

### Implementation

**1. Extend the DB schema:**
```sql
-- Migration 023
ALTER TABLE profiles ADD COLUMN goal_mode VARCHAR(50) DEFAULT 'health';
ALTER TABLE profiles ADD COLUMN goal_weight DECIMAL(5,2);
ALTER TABLE profiles ADD COLUMN goal_bf_pct DECIMAL(4,1);
ALTER TABLE profiles ADD COLUMN goal_set_date DATE;
ALTER TABLE profiles ADD COLUMN goal_target_date DATE;

COMMENT ON COLUMN profiles.goal_mode IS
  'Primary goal: weight_loss, strength, endurance, recomposition, health';
```

**2. Goal-mode constants:**
```python
# backend/goals.py (NEW)
GOAL_MODES = {
    'weight_loss': {
        'label': 'Gewichtsreduktion',
        'description': 'Fettabbau bei Erhalt der Magermasse',
        'score_weights': {
            'body_progress': 0.30,
            'nutrition': 0.25,
            'activity': 0.20,
            'recovery': 0.15,
            'health_risk': 0.10
        },
        'focus_areas': ['fettmasse', 'gewichtstrend', 'kalorienbilanz', 'protein_sicherung']
    },
    'strength': {
        'label': 'Kraftaufbau',
        'description': 'Muskelaufbau und Kraftsteigerung',
        'score_weights': {
            'body_progress': 0.20,
            'nutrition': 0.25,
            'activity': 0.30,
            'recovery': 0.20,
            'health_risk': 0.05
        },
        'focus_areas': ['trainingsqualitaet', 'protein', 'lbm', 'recovery']
    },
    'endurance': {
        'label': 'Ausdaueraufbau',
        'description': 'Kondition und VO2max verbessern',
        'score_weights': {
            'body_progress': 0.10,
            'nutrition': 0.20,
            'activity': 0.35,
            'recovery': 0.25,
            'health_risk': 0.10
        },
        'focus_areas': ['trainingsvolumen', 'intensitaetsverteilung', 'vo2max', 'recovery']
    },
    'recomposition': {
        'label': 'Körperrekomposition',
        'description': 'Fettabbau bei gleichzeitigem Muskelaufbau',
        'score_weights': {
            'body_progress': 0.30,
            'nutrition': 0.25,
            'activity': 0.25,
            'recovery': 0.15,
            'health_risk': 0.05
        },
        'focus_areas': ['lbm', 'fettmasse', 'protein', 'trainingsqualitaet']
    },
    'health': {
        'label': 'Allgemeine Gesundheit',
        'description': 'Ausgeglichenes Gesundheits- und Fitnesstraining',
        'score_weights': {
            'body_progress': 0.20,
            'nutrition': 0.20,
            'activity': 0.20,
            'recovery': 0.20,
            'health_risk': 0.20
        },
        'focus_areas': ['bewegung', 'blutdruck', 'schlaf', 'gewicht', 'regelmaessigkeit']
    }
}
```

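Since every mode's `score_weights` is meant to sum to 1.0, a small startup check catches typos when modes are edited. A sketch with a trimmed, inlined copy of the structure above (`validate_goal_modes` is a hypothetical helper, not part of the concept):

```python
# Sketch: sanity-check that every goal mode's score weights sum to 1.0.
# A trimmed copy of GOAL_MODES is inlined so the snippet runs on its own.
GOAL_MODES = {
    'weight_loss': {'score_weights': {'body_progress': 0.30, 'nutrition': 0.25,
                                      'activity': 0.20, 'recovery': 0.15,
                                      'health_risk': 0.10}},
    'health': {'score_weights': {'body_progress': 0.20, 'nutrition': 0.20,
                                 'activity': 0.20, 'recovery': 0.20,
                                 'health_risk': 0.20}},
}

def validate_goal_modes(modes: dict) -> list:
    """Return the names of modes whose weights do not sum to 1.0."""
    bad = []
    for name, cfg in modes.items():
        if abs(sum(cfg['score_weights'].values()) - 1.0) > 1e-9:
            bad.append(name)
    return bad

print(validate_goal_modes(GOAL_MODES))  # []
```

Running this once at import time (and failing loudly) keeps downstream score math trustworthy.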
**3. API endpoint:**
```python
# routers/goals.py (NEW)
from typing import Optional

from fastapi import APIRouter, Depends, HTTPException

from auth import require_auth
from db import get_db, get_cursor, r2d  # project DB helpers (module path may differ)
from goals import GOAL_MODES

router = APIRouter(prefix="/api/goals", tags=["goals"])

@router.get("/modes")
def get_goal_modes():
    """Return all available goal modes with descriptions."""
    return GOAL_MODES

@router.get("/current")
def get_current_goal(session: dict = Depends(require_auth)):
    """Get user's current goal settings."""
    profile_id = session['profile_id']
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute(
            """SELECT goal_mode, goal_weight, goal_bf_pct,
                      goal_set_date, goal_target_date
               FROM profiles WHERE id=%s""",
            (profile_id,)
        )
        row = r2d(cur.fetchone())
        return {
            **row,
            'mode_config': GOAL_MODES.get(row['goal_mode'], GOAL_MODES['health'])
        }

@router.post("/set")
def set_goal(
    goal_mode: str,
    goal_weight: Optional[float] = None,
    goal_bf_pct: Optional[float] = None,
    target_date: Optional[str] = None,
    session: dict = Depends(require_auth)
):
    """Set user's goal."""
    if goal_mode not in GOAL_MODES:
        raise HTTPException(400, f"Invalid goal_mode. Must be one of: {list(GOAL_MODES.keys())}")

    profile_id = session['profile_id']
    with get_db() as conn:
        cur = get_cursor(conn)
        cur.execute(
            """UPDATE profiles
               SET goal_mode=%s, goal_weight=%s, goal_bf_pct=%s,
                   goal_set_date=CURRENT_DATE, goal_target_date=%s
               WHERE id=%s""",
            (goal_mode, goal_weight, goal_bf_pct, target_date, profile_id)
        )
        conn.commit()

    return {"success": True, "goal_mode": goal_mode}
```

**4. Frontend UI (Settings.jsx):**
```jsx
// Minimal goal selector
function GoalSettings() {
  const [goalModes, setGoalModes] = useState({})
  const [currentGoal, setCurrentGoal] = useState(null)
  const [selectedMode, setSelectedMode] = useState('health')
  const [goalWeight, setGoalWeight] = useState('')
  const [goalBfPct, setGoalBfPct] = useState('')
  const [targetDate, setTargetDate] = useState('')

  useEffect(() => {
    loadGoalModes()
    loadCurrentGoal()
  }, [])

  const loadGoalModes = async () => {
    const modes = await api.getGoalModes()
    setGoalModes(modes)
  }

  const loadCurrentGoal = async () => {
    const goal = await api.getCurrentGoal()
    setCurrentGoal(goal)
    setSelectedMode(goal.goal_mode || 'health')
  }

  const saveGoal = async () => {
    await api.setGoal({
      goal_mode: selectedMode,
      goal_weight: goalWeight,
      goal_bf_pct: goalBfPct,
      target_date: targetDate
    })
    loadCurrentGoal()
  }

  return (
    <div className="card">
      <h2>🎯 Trainingsziel</h2>

      <div className="form-row">
        <label>Hauptziel</label>
        <select value={selectedMode} onChange={e => setSelectedMode(e.target.value)}>
          {Object.entries(goalModes).map(([key, config]) => (
            <option key={key} value={key}>
              {config.label}
            </option>
          ))}
        </select>
        <p style={{fontSize: 12, color: 'var(--text3)'}}>
          {goalModes[selectedMode]?.description}
        </p>
      </div>

      {(selectedMode === 'weight_loss' || selectedMode === 'recomposition') && (
        <div className="form-row">
          <label>Zielgewicht (optional)</label>
          <input type="number" step="0.1" value={goalWeight} onChange={...} />
        </div>
      )}

      <button onClick={saveGoal}>Ziel speichern</button>
    </div>
  )
}
```

### Effort: 2-3h
- 1h: DB + backend
- 1h: frontend UI
- 0.5h: testing

---

## Phase 0b: Goal-Aware Placeholders (16-20h)

**Implement all 84 placeholders, BUT:**
- Score calculations use `goal_mode` from the start
- Example:

```python
def get_body_progress_score(profile_id: str) -> str:
    """Body Progress Score (0-100, goal-dependent)."""
    profile = get_profile_data(profile_id)
    goal_mode = profile.get('goal_mode', 'health')

    # Fetch weights from goals.GOAL_MODES
    weights = GOAL_MODES[goal_mode]['score_weights']

    # Calculate sub-scores
    fm_score = calculate_fm_progress(profile_id)
    lbm_score = calculate_lbm_progress(profile_id)
    weight_score = calculate_weight_progress(profile_id, goal_mode)

    # Weight by goal
    if goal_mode == 'weight_loss':
        total = (0.50 * fm_score + 0.30 * weight_score + 0.20 * lbm_score)
    elif goal_mode == 'strength':
        total = (0.60 * lbm_score + 0.30 * fm_score + 0.10 * weight_score)
    elif goal_mode == 'recomposition':
        total = (0.45 * fm_score + 0.45 * lbm_score + 0.10 * weight_score)
    else:  # health, endurance
        total = (0.40 * weight_score + 0.30 * fm_score + 0.30 * lbm_score)

    return f"{int(total)}/100"
```

**Result:**
- Charts get **correct** scores from day one
- No rework needed later
- The system is "smart" from day 1

---

## Phase 2+: Full Goal System (6-8h)

**Features:**
1. **Goal detection from data**
   - Pattern analysis (as above)
   - Suggestion with confidence
   - "Does your goal still fit?" check

2. **Secondary goals**
   - `goal_mode` = primary
   - `secondary_goals[]` = additional focus areas
   - Weighting: 70% primary, 30% secondary

3. **Goal progression tracking**
   - Progress toward the goal (%)
   - Estimated completion date
   - Adjustment suggestions

4. **Goal-aware charts**
   - Prioritization by goal_relevance
   - Dashboard shows goal-specific charts first

5. **Goal-aware AI**
   - Prompt context includes goal_mode
   - AI interprets goal-specifically

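The 70/30 split for secondary goals can be sketched as a key-by-key blend of two weight tables. This is an illustration only: `blend_weights` is a hypothetical helper, and the tables are trimmed copies of the GOAL_MODES example.

```python
# Sketch: blend score weights from a primary goal mode with one secondary
# mode at the 70/30 split mentioned above.
WEIGHTS = {
    'weight_loss': {'body_progress': 0.30, 'nutrition': 0.25, 'activity': 0.20,
                    'recovery': 0.15, 'health_risk': 0.10},
    'endurance':   {'body_progress': 0.10, 'nutrition': 0.20, 'activity': 0.35,
                    'recovery': 0.25, 'health_risk': 0.10},
}

def blend_weights(primary: str, secondary: str, split: float = 0.70) -> dict:
    """70% primary + 30% secondary, key by key."""
    return {
        key: round(split * WEIGHTS[primary][key]
                   + (1 - split) * WEIGHTS[secondary][key], 4)
        for key in WEIGHTS[primary]
    }

blended = blend_weights('weight_loss', 'endurance')
print(blended['activity'])  # 0.245
```

Because both inputs sum to 1.0, the blended table does too, so the downstream score math stays unchanged.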
---

## 5. Decision Matrix

### Option A: full goal system FIRST
**Effort:** 10-12h
**Pro:**
- Everything consistent from the start
- No rework
**Con:**
- Delays the placeholder work
- Goal detection needs placeholders (chicken-and-egg)

### Option B: placeholders FIRST, then goals
**Effort:** 16-20h + rework later
**Pro:**
- Fast start
**Con:**
- ALL scores weighted incorrectly
- Complete rework required
- Users see wrong values

### Option C: HYBRID ⭐ **RECOMMENDED**
**Effort:** 2-3h (minimal goals) + 16-20h (goal-aware placeholders) + later 6-8h (full system)
**Pro:**
- ✅ Best of both worlds
- ✅ Correct scores from the start
- ✅ No rework
- ✅ Goal detection later as an enhancement
**Con:**
- No significant downside

---

## 6. Recommendation

### YES, goal system BEFORE placeholders, but minimal!

**Order:**

1. **Phase 0a (2-3h):** minimal goal system
   - DB: goal_mode field
   - API: get/set goal
   - UI: goal selector (Settings)
   - Default: "health"

2. **Phase 0b (16-20h):** goal-aware placeholders
   - Implement 84 placeholders
   - Scores use goal_mode
   - Calculations are goal-dependent

3. **Phase 1 (12-16h):** charts
   - Use goal-aware placeholders
   - Show correct interpretations

4. **Phase 2+ (6-8h):** full goal system
   - Goal detection
   - Secondary goals
   - Goal progression tracking

---

## 7. Conclusion

**Your intuition was 100% right!**

❌ **Without a goal system:**
- Charts show wrong interpretations
- Scores are generic and fit no one
- The system stays a "dumb data collector"

✅ **With a goal system:**
- Charts interpret goal-specifically
- Scores are weighted individually
- The system becomes an "intelligent coach"

**Next step:** implement Phase 0a (2-3h), then Phase 0b with goal-aware placeholders.

**Should I start with Phase 0a (minimal goal system)?**

---

**New file:** docs/GOAL_SYSTEM_REDESIGN_v2.md (729 lines)

# Goal System Redesign v2.0

**Date:** March 26, 2026
**Status:** 📋 CONCEPT
**Trigger:** fundamental design problems identified in Phase 0a

---

## 1. Problems with the Current Implementation (Phase 0a)

### 1.1 Primary goal too simplistic
**Problem:**
- Only ONE primary goal allowed
- Binary system (primary/not primary)
- Toggle does not work correctly on update

**Reality:**
- Users have SEVERAL goals at once, with different priorities
- Example: 30% weight loss, 25% strength, 25% endurance, 20% flexibility

**Solution:**
→ **Weighting system** (0-100%, sum = 100%)

---

### 1.2 One goal mode too simple
**Problem:**
- The user must pick ONE mode (weight_loss OR strength)
- In reality: combined goals (weight loss + strength + endurance at once)

**Reality (user quote):**
> "After an operation I am trying to build strength and endurance, lose weight at the same time, and regain my flexibility and coordination."

**Solution:**
→ **Multi-mode with weighting** instead of a single mode

---

### 1.3 Missing current values
**Problem:**
- `lean_mass` current value = "-" (not implemented)
- `strength`, `flexibility` have no data sources
- VO2Max throws an Internal Server Error

**Solution:**
→ Connect all goal types to correct data sources

---

### 1.4 Abstract goal types
**Problem:**
- "Strength": what does that mean? Bench press? Squat? Overall?
- "Flexibility": which test? Sit-and-reach? Hip flexion?
- Too unspecific for concrete measurement

**Solution:**
→ **Concrete, measurable goal types** with standardized tests

---

### 1.5 Blood pressure as a single value
**Problem:**
- BP needs TWO values (systolic/diastolic)
- Current schema: only one `target_value`

**Solution:**
→ **Compound goals** (goals with multiple values)

---

### 1.6 No guidance for users
**Problem:**
- Users must enter concrete numbers without context
- What is a good VO2Max value? What is realistic?

**Solution:**
→ **Reference values, norms, examples** in the UI

---

## 2. Redesign Concept v2.0

### 2.1 Core principles

**Principle 1: weighting instead of prioritization**
- Every goal has a weight (0-100%)
- The weights sum to 100%
- The AI factors the weights into its analyses

**Principle 2: multi-dimensional instead of singular**
- No single "goal mode"
- Instead: a weighted combination of focus areas
- Realistic: users have several goals at once

**Principle 3: concrete instead of abstract**
- Every goal is clearly measurable
- Standardized tests where possible
- Data sources unambiguously defined

**Principle 4: guidance instead of guesswork**
- Reference values for every goal
- Age- and sex-specific norms
- Examples and explanations

---

## 3. New Data Model

### 3.1 Focus areas (instead of goal modes)

**Table: `focus_areas` (NEW)**

```sql
CREATE TABLE focus_areas (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  profile_id UUID NOT NULL REFERENCES profiles(id) ON DELETE CASCADE,

  -- Weighted focus areas
  weight_loss_pct INT DEFAULT 0,  -- 0-100%
  muscle_gain_pct INT DEFAULT 0,  -- 0-100%
  endurance_pct INT DEFAULT 0,    -- 0-100%
  strength_pct INT DEFAULT 0,     -- 0-100%
  flexibility_pct INT DEFAULT 0,  -- 0-100%
  health_pct INT DEFAULT 0,       -- 0-100% (maintenance, no specific goal)

  -- Constraint: the percentages must sum to 100
  CONSTRAINT sum_equals_100 CHECK (
    weight_loss_pct + muscle_gain_pct + endurance_pct +
    strength_pct + flexibility_pct + health_pct = 100
  ),

  active BOOLEAN DEFAULT true,
  created_at TIMESTAMP DEFAULT NOW(),
  updated_at TIMESTAMP DEFAULT NOW()
);

-- Only one active focus mix per user.
-- (A partial UNIQUE constraint is not valid inside CREATE TABLE in
--  PostgreSQL, so use a partial unique index instead.)
CREATE UNIQUE INDEX one_active_focus_per_profile
  ON focus_areas (profile_id) WHERE active;

COMMENT ON TABLE focus_areas IS
  'Weighted focus distribution - replaces single goal_mode.
   Example: 30% weight loss + 25% strength + 25% endurance + 20% flexibility = 100%';
```

**Example data:**
```json
// User after an operation (as described in the feedback):
{
  "weight_loss_pct": 30,
  "muscle_gain_pct": 20,
  "endurance_pct": 25,
  "strength_pct": 15,
  "flexibility_pct": 10,
  "health_pct": 0
}

// User with a pure strength focus:
{
  "weight_loss_pct": 0,
  "muscle_gain_pct": 50,
  "strength_pct": 40,
  "endurance_pct": 10,
  "flexibility_pct": 0,
  "health_pct": 0
}

// User with weight loss as the primary focus:
{
  "weight_loss_pct": 60,
  "muscle_gain_pct": 0,
  "endurance_pct": 20,
  "strength_pct": 10,
  "flexibility_pct": 5,
  "health_pct": 5
}
```

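The `sum_equals_100` CHECK above should also be validated application-side, so the API can return a friendly error before the DB rejects the row. A sketch; `validate_focus_mix` is a hypothetical helper:

```python
# Sketch: application-side counterpart of the sum_equals_100 CHECK above.
FOCUS_KEYS = ('weight_loss_pct', 'muscle_gain_pct', 'endurance_pct',
              'strength_pct', 'flexibility_pct', 'health_pct')

def validate_focus_mix(mix: dict) -> tuple:
    """Return (ok, message) for a proposed focus distribution."""
    values = [int(mix.get(key, 0)) for key in FOCUS_KEYS]
    if any(v < 0 or v > 100 for v in values):
        return False, 'each share must be between 0 and 100'
    total = sum(values)
    if total != 100:
        return False, f'shares sum to {total}, expected 100'
    return True, 'ok'

print(validate_focus_mix({'weight_loss_pct': 30, 'muscle_gain_pct': 20,
                          'endurance_pct': 25, 'strength_pct': 15,
                          'flexibility_pct': 10}))  # (True, 'ok')
```

Missing keys default to 0, matching the column defaults in the table definition.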
---

### 3.2 Revised goal types

**Table: `goals` (REVISED)**

**A) Simple goals (one value):**

```sql
goal_type:
  - 'weight'       → kg (from weight_log)
  - 'body_fat_pct' → % (from caliper_log)
  - 'lean_mass'    → kg (computed: weight - (weight * bf_pct))
  - 'vo2max'       → ml/kg/min (from vitals_baseline)
  - 'rhr'          → bpm (from vitals_baseline)
  - 'hrv'          → ms (from vitals_baseline)
```

**B) Test-based goals (standardized tests):**
```sql
goal_type:
  - 'cooper_test'  → meters (12-min run)
  - 'pushups_max'  → count
  - 'plank_max'    → seconds
  - 'sit_reach'    → cm (flexibility)
  - 'squat_1rm'    → kg (lower-body strength)
  - 'bench_1rm'    → kg (upper-body strength)
  - 'deadlift_1rm' → kg (back strength)
```

**C) Compound goals (multiple values):**
```sql
goal_type:
  - 'blood_pressure' → systolic/diastolic (mmHg)
    → needs: target_value_secondary
```

**Schema-Erweiterung:**
|
||||||
|
```sql
|
||||||
|
ALTER TABLE goals ADD COLUMN goal_weight INT DEFAULT 100;
|
||||||
|
-- Gewichtung dieses Ziels (0-100%)
|
||||||
|
-- Summe aller goal_weight für einen User sollte ~100% sein
|
||||||
|
|
||||||
|
ALTER TABLE goals ADD COLUMN target_value_secondary DECIMAL(10,2);
|
||||||
|
-- Für Compound Goals (z.B. BP diastolisch)
|
||||||
|
|
||||||
|
ALTER TABLE goals ADD COLUMN current_value_secondary DECIMAL(10,2);
|
||||||
|
-- Aktueller Wert für sekundären Target
|
||||||
|
|
||||||
|
ALTER TABLE goals DROP COLUMN is_primary;
|
||||||
|
-- Nicht mehr nötig (wird durch goal_weight ersetzt)
|
||||||
|
|
||||||
|
COMMENT ON COLUMN goals.goal_weight IS
|
||||||
|
'Weight/priority of this goal (0-100%).
|
||||||
|
Higher weight = more important in AI scoring.
|
||||||
|
Sum of all goal_weight should be ~100% per user.';
|
||||||
|
```
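
Since goal weights are user-entered, their sum will drift from 100 as goals are added and removed. Scoring can rescale them proportionally before use; a sketch (`normalize_goal_weights` is a hypothetical helper, not existing code):

```python
def normalize_goal_weights(goals: list[dict]) -> list[dict]:
    """Rescale goal_weight values so they sum to 100 (proportionally).

    Returns the list unchanged if all weights are zero.
    """
    total = sum(g["goal_weight"] for g in goals)
    if total == 0:
        return goals
    factor = 100 / total
    return [{**g, "goal_weight": round(g["goal_weight"] * factor, 1)} for g in goals]
```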

---

### 3.3 Data Source Mapping

**Correct current-value extraction:**

```python
# backend/routers/goals.py - _get_current_value_for_goal_type()

GOAL_TYPE_SOURCES = {
    # Simple values from existing tables
    'weight': {
        'table': 'weight_log',
        'column': 'weight',
        'order': 'date DESC'
    },
    'body_fat_pct': {
        'table': 'caliper_log',
        'column': 'body_fat_pct',
        'order': 'date DESC'
    },
    'lean_mass': {
        'calculation': 'weight - (weight * body_fat_pct / 100)',
        'requires': ['weight_log', 'caliper_log']
    },
    'vo2max': {
        'table': 'vitals_baseline',
        'column': 'vo2_max',
        'order': 'date DESC'
    },
    'rhr': {
        'table': 'vitals_baseline',
        'column': 'resting_hr',
        'order': 'date DESC'
    },
    'hrv': {
        'table': 'vitals_baseline',
        'column': 'hrv',
        'order': 'date DESC'
    },

    # Test-based values from fitness_tests
    'cooper_test': {
        'table': 'fitness_tests',
        'filter': "test_type = 'cooper_12min'",
        'column': 'result_value',
        'order': 'test_date DESC'
    },
    'pushups_max': {
        'table': 'fitness_tests',
        'filter': "test_type = 'pushups_max'",
        'column': 'result_value',
        'order': 'test_date DESC'
    },
    # ... further tests

    # Compound goals
    'blood_pressure': {
        'table': 'blood_pressure_log',
        'columns': ['systolic', 'diastolic'],  # both values
        'order': 'measured_at DESC'
    }
}
```
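
For the table-backed entries, this mapping can drive a generic query builder; the `calculation` variant needs its own branch. A sketch of the query-building half (illustrative, not the actual `_get_current_value_for_goal_type` implementation):

```python
def build_latest_value_query(source: dict) -> str:
    """Build a SELECT for the most recent value described by a mapping entry.

    Handles the simple 'table'/'column' entries (including multi-'columns'
    compound goals); 'calculation' goal types need a dedicated branch.
    """
    cols = source.get('columns') or [source['column']]
    query = f"SELECT {', '.join(cols)} FROM {source['table']}"
    if 'filter' in source:
        query += f" WHERE {source['filter']}"
    return f"{query} ORDER BY {source['order']} LIMIT 1"
```

In the real router the per-profile filter (`profile_id = %s`) would of course be added as a bound parameter, never interpolated.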

---

## 4. UI/UX Redesign

### 4.1 Focus Areas Configurator

**Instead of 5 separate cards → slider interface:**

```
┌─────────────────────────────────────────────────────┐
│ 🎯 My Training Focus                                │
├─────────────────────────────────────────────────────┤
│ Move the sliders to set your priorities.            │
│ The total must add up to 100%.                      │
│                                                     │
│ 📉 Weight Loss             [====]              30%  │
│    Emphasis on calorie deficit & fat loss           │
│                                                     │
│ 💪 Muscle Gain             [===]               20%  │
│    Increase lean mass, body composition             │
│                                                     │
│ 🏃 Endurance               [====]              25%  │
│    VO2Max, aerobic capacity, pace                   │
│                                                     │
│ 🏋️ Maximal Strength        [==]                15%  │
│    1RM progression, progressive overload            │
│                                                     │
│ 🤸 Flexibility             [=]                 10%  │
│    Mobility, flexibility, coordination              │
│                                                     │
│ ❤️ General Health          [ ]                  0%  │
│    Maintenance, preventive                          │
│                                                     │
│ ─────────────────────────────────────────────────── │
│ Total: 100% ✓                                       │
│                                                     │
│ [Save]  [Reset]                                     │
└─────────────────────────────────────────────────────┘
```

**Technical notes:**
- HTML range sliders (0-100)
- Live update of the total
- Validation: the total must equal 100%
- Auto-adjust: when the user raises one slider, reduce the others proportionally
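
The auto-adjust rule can be expressed as a pure function: set one focus area and rescale the remaining areas so the total stays at 100. A sketch of the algorithm (the real UI would run the equivalent in JavaScript on slider input; function name and integer-rounding policy are assumptions):

```python
def adjust_focus(mix: dict, changed_key: str, new_value: int) -> dict:
    """Set one focus percentage and proportionally rescale the others
    so the sum stays at exactly 100."""
    others = {k: v for k, v in mix.items() if k != changed_key}
    remaining = 100 - new_value
    old_total = sum(others.values())
    if old_total == 0:
        # Nothing to scale against; distribute the remainder evenly
        share, extra = divmod(remaining, len(others))
        adjusted = {k: share for k in others}
        first = next(iter(adjusted))
        adjusted[first] += extra  # put the rounding remainder on the first key
    else:
        adjusted = {k: round(v * remaining / old_total) for k, v in others.items()}
        # Fix integer-rounding drift so the sum is exactly 100
        drift = remaining - sum(adjusted.values())
        first = next(iter(adjusted))
        adjusted[first] += drift
    return {changed_key: new_value, **adjusted}
```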

---

### 4.2 Goals with Weighting

**Goal list with weighting indicator:**

```
┌─────────────────────────────────────────────────────┐
│ 🎯 Concrete Goals                                   │
├─────────────────────────────────────────────────────┤
│ ┌───────────────────────────────────────────────┐   │
│ │ ⚖️ Target weight: 82 kg                 [30%] │   │
│ │ Start: 95 kg → Current: 89 kg → Goal: 82 kg   │   │
│ │ ████████████░░░░░░░░░░ 65%                    │   │
│ │ ✓ Projected: 2026-05-15 (on track)            │   │
│ │                                   [✏️] [🗑️]   │   │
│ └───────────────────────────────────────────────┘   │
│                                                     │
│ ┌───────────────────────────────────────────────┐   │
│ │ 💪 Lean mass: 72 kg                     [20%] │   │
│ │ Start: 68 kg → Current: 70.5 kg → Goal: 72 kg │   │
│ │ ██████████░░░░░░░░░░░░ 63%                    │   │
│ │ ⚠ Forecast: 2026-06-20 (5 days late)          │   │
│ │                                   [✏️] [🗑️]   │   │
│ └───────────────────────────────────────────────┘   │
│                                                     │
│ [+ New Goal]                                        │
└─────────────────────────────────────────────────────┘

Sum of weightings: 50% (50% still available)
```

**Changes:**
- Weighting shown as a `[30%]` badge
- Sum displayed at the bottom
- Warning when the sum exceeds 100%

---

### 4.3 Goal Editor with Guidance

**Example: creating a VO2Max goal:**

```
┌─────────────────────────────────────────────────────┐
│ Create New Goal                                     │
├─────────────────────────────────────────────────────┤
│ Goal type                                           │
│ [VO2 Max ▼]                                         │
│                                                     │
│ ℹ️ VO2 Max (ml/kg/min) - maximal oxygen uptake.     │
│    Measures aerobic capacity.                       │
│                                                     │
│ 📊 Reference values (men, age 35):                  │
│    Very good:     > 48 ml/kg/min                    │
│    Good:          44-48 ml/kg/min                   │
│    Average:       40-44 ml/kg/min                   │
│    Below average: 35-40 ml/kg/min                   │
│                                                     │
│ 🎯 Target value                                     │
│    ┌──────────┬──────────┐                          │
│    │ [  46  ] │ ml/kg/min│                          │
│    └──────────┴──────────┘                          │
│    Your current value: 42 ml/kg/min (average)       │
│    → Target is in the "Good" range ✓                │
│                                                     │
│ 📅 Target date (optional)                           │
│    [2026-06-30]                                     │
│                                                     │
│ ⚖️ Weighting                                        │
│    [====      ] 25%                                 │
│    How important is this goal to you?               │
│                                                     │
│ 💡 Name (optional)                                  │
│    [Endurance for mountain hiking      ]            │
│                                                     │
│ [Create Goal]  [Cancel]                             │
└─────────────────────────────────────────────────────┘
```

**Features:**
- Info box with explanation
- Age- and sex-specific reference values
- Live feedback on the entered value
- Current value loaded automatically
- Weighting slider with live preview
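
The live feedback ("Target is in the 'Good' range") reduces to classifying a value against the reference bands in the mockup. A sketch with the men/age-35 VO2Max bands hardcoded as sample data (the real editor would look bands up by age and sex; band edges are treated as inclusive lower bounds, and the label below the lowest band is an assumption):

```python
# Reference bands for VO2Max (ml/kg/min), men, age 35 - sample data only.
# Ordered from best to worst; each entry is (inclusive lower bound, label).
VO2MAX_BANDS = [
    (48, "Very good"),
    (44, "Good"),
    (40, "Average"),
    (35, "Below average"),
]

def classify_vo2max(value: float) -> str:
    """Return the reference-band label for a VO2Max value."""
    for lower_bound, label in VO2MAX_BANDS:
        if value >= lower_bound:
            return label
    return "Poor"  # below the lowest listed band (label assumed)
```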

---

### 4.4 Compound Goals (Blood Pressure)

**Dedicated UI for blood pressure:**

```
┌─────────────────────────────────────────────────────┐
│ Goal type: Blood Pressure                           │
├─────────────────────────────────────────────────────┤
│ 🎯 Target values                                    │
│                                                     │
│ Systolic (upper value)                              │
│ [ 120 ] mmHg                                        │
│                                                     │
│ Diastolic (lower value)                             │
│ [  80 ] mmHg                                        │
│                                                     │
│ ℹ️ WHO/ISH classification:                          │
│    Optimal:      < 120/80 mmHg                      │
│    Normal:       120-129 / 80-84 mmHg               │
│    High-normal:  130-139 / 85-89 mmHg               │
│    Hypertension: ≥ 140/90 mmHg                      │
│                                                     │
│ Your current value: 135/88 mmHg (high-normal)       │
│ Your target: 120/80 mmHg (optimal) ✓                │
│                                                     │
│ [Create Goal]  [Cancel]                             │
└─────────────────────────────────────────────────────┘
```
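
Classifying a reading into these bands needs both values: the higher-risk category of the systolic and diastolic components wins. A sketch following the band edges shown in the mockup (illustrative logic, not an existing module):

```python
def classify_blood_pressure(systolic: int, diastolic: int) -> str:
    """Classify a BP reading; the worse of the two components decides."""
    def category(value: int, bounds: tuple[int, int, int]) -> int:
        # bounds are the lower edges of normal / high-normal / hypertension
        for idx, bound in enumerate(bounds):
            if value < bound:
                return idx
        return len(bounds)

    sys_cat = category(systolic, (120, 130, 140))
    dia_cat = category(diastolic, (80, 85, 90))
    labels = ["Optimal", "Normal", "High-normal", "Hypertension"]
    return labels[max(sys_cat, dia_cat)]
```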

---

## 5. Scoring System with Weighting

### 5.1 Score Calculation v2.0

**Current (Phase 0a):**
```python
# Fixed weighting per goal_mode
SCORE_WEIGHTS = {
    "strength": {
        "body_progress": 0.35,
        "nutrition": 0.30,
        # ...
    }
}
```

**New (v2.0):**
```python
def calculate_weighted_score(profile_id):
    """
    Calculates the score based on:
    1. Focus areas (multi-dimensional instead of a single mode)
    2. Goal weights (individual per-goal weightings)
    """

    # 1. Fetch focus areas
    focus = get_focus_areas(profile_id)
    # → {weight_loss: 30%, muscle_gain: 20%, endurance: 25%, ...}

    # 2. Fetch all goals with their weights
    goals = get_goals_with_weights(profile_id)
    # → [{type: 'weight', weight: 30%}, {type: 'lean_mass', weight: 20%}, ...]

    # 3. Compute base scores
    base_scores = {
        'body_composition': calculate_body_score(profile_id),
        'nutrition': calculate_nutrition_score(profile_id),
        'training': calculate_training_score(profile_id),
        'recovery': calculate_recovery_score(profile_id)
    }

    # 4. Weight the scores by focus areas
    weighted_score = 0

    # Weight-loss focus → body composition + nutrition matter more
    if focus['weight_loss_pct'] > 0:
        weighted_score += (
            base_scores['body_composition'] * 0.4 +
            base_scores['nutrition'] * 0.4 +
            base_scores['training'] * 0.1 +
            base_scores['recovery'] * 0.1
        ) * (focus['weight_loss_pct'] / 100)

    # Muscle-gain focus → body + nutrition + training
    if focus['muscle_gain_pct'] > 0:
        weighted_score += (
            base_scores['body_composition'] * 0.35 +
            base_scores['nutrition'] * 0.35 +
            base_scores['training'] * 0.25 +
            base_scores['recovery'] * 0.05
        ) * (focus['muscle_gain_pct'] / 100)

    # Endurance focus → training + recovery
    if focus['endurance_pct'] > 0:
        weighted_score += (
            base_scores['training'] * 0.50 +
            base_scores['recovery'] * 0.30 +
            base_scores['body_composition'] * 0.10 +
            base_scores['nutrition'] * 0.10
        ) * (focus['endurance_pct'] / 100)

    # ... further focus areas

    return {
        'overall_score': round(weighted_score, 1),
        'base_scores': base_scores,
        'focus_weights': focus,
        'goal_weights': [g['weight'] for g in goals]
    }
```

**Example:**
```python
User: 30% Weight Loss + 25% Endurance + 20% Muscle Gain + 25% Strength

Base Scores:
- Body Composition: 75/100
- Nutrition: 80/100
- Training: 70/100
- Recovery: 65/100

Calculation:
Weight Loss (30%):
  = (75*0.4 + 80*0.4 + 70*0.1 + 65*0.1) * 0.30
  = 75.5 * 0.30 = 22.65

Endurance (25%):
  = (70*0.50 + 65*0.30 + 75*0.10 + 80*0.10) * 0.25
  = 70.0 * 0.25 = 17.50

Muscle Gain (20%):
  = (75*0.35 + 80*0.35 + 70*0.25 + 65*0.05) * 0.20
  = 75.0 * 0.20 = 15.00

Strength (25%):
  = (70*0.40 + 80*0.30 + 75*0.20 + 65*0.10) * 0.25
  = 73.5 * 0.25 = 18.38

Overall Score = 22.65 + 17.50 + 15.00 + 18.38 = 73.53/100
```
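
The worked example can be checked in code; this standalone snippet reproduces the calculation with the per-focus sub-weights hardcoded (the strength row uses the sub-weights from the example, since that branch is elided in the function above):

```python
BASE = {"body": 75, "nutrition": 80, "training": 70, "recovery": 65}

# Per-focus sub-weights over (body, nutrition, training, recovery)
FOCUS_SUBWEIGHTS = {
    "weight_loss": (0.40, 0.40, 0.10, 0.10),
    "endurance":   (0.10, 0.10, 0.50, 0.30),
    "muscle_gain": (0.35, 0.35, 0.25, 0.05),
    "strength":    (0.20, 0.30, 0.40, 0.10),
}

def overall_score(focus_pct: dict) -> float:
    """Sum the focus-weighted partial scores (the scheme of calculate_weighted_score)."""
    total = 0.0
    order = ("body", "nutrition", "training", "recovery")
    for area, pct in focus_pct.items():
        sub = FOCUS_SUBWEIGHTS[area]
        partial = sum(BASE[key] * w for key, w in zip(order, sub))
        total += partial * pct / 100
    return round(total, 2)
```

With the example mix (30/25/20/25) this reproduces the ~73.53 total from the calculation above.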

---

## 6. Migration Strategy

### 6.1 Data Migration from Phase 0a

**Existing data:**
- `profiles.goal_mode` (single mode)
- `goals` with `is_primary`

**Migration logic:**
```sql
-- Migration 023: Goal System Redesign v2.0

-- 1. Create the focus_areas table
CREATE TABLE focus_areas (...);

-- 2. Migrate existing goal_mode → focus_areas
INSERT INTO focus_areas (profile_id, weight_loss_pct, muscle_gain_pct, ...)
SELECT
    id,
    CASE goal_mode
        WHEN 'weight_loss' THEN 70  -- 70% weight loss + 15% health + 15% endurance
        WHEN 'strength' THEN 0
        -- ...
    END as weight_loss_pct,
    CASE goal_mode
        WHEN 'strength' THEN 60
        WHEN 'recomposition' THEN 30
        -- ...
    END as muscle_gain_pct,
    -- ... further columns
FROM profiles
WHERE goal_mode IS NOT NULL;

-- 3. Extend the goals table
ALTER TABLE goals ADD COLUMN goal_weight INT DEFAULT 100;
ALTER TABLE goals ADD COLUMN target_value_secondary DECIMAL(10,2);
ALTER TABLE goals ADD COLUMN current_value_secondary DECIMAL(10,2);

-- 4. Migrate is_primary → goal_weight
UPDATE goals SET goal_weight = 100 WHERE is_primary = true;
UPDATE goals SET goal_weight = 50 WHERE is_primary = false;

-- 5. Cleanup (later)
-- ALTER TABLE profiles DROP COLUMN goal_mode;  -- after verification
-- ALTER TABLE goals DROP COLUMN is_primary;    -- after verification
```
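
Instead of hand-writing one CASE expression per focus column, the goal_mode → focus mapping can live in a single table in the migration tooling and be sanity-checked before SQL is emitted. A sketch: only the `weight_loss` row (70/15/15) is specified in the text above; the other rows are placeholders to be decided during implementation.

```python
# goal_mode → focus-area percentages; each row must sum to 100.
# Only the 'weight_loss' row is given in the concept; the rest are
# illustrative placeholders.
GOAL_MODE_TO_FOCUS = {
    "weight_loss":   {"weight_loss_pct": 70, "health_pct": 15, "endurance_pct": 15},
    "strength":      {"muscle_gain_pct": 60, "strength_pct": 40},
    "recomposition": {"muscle_gain_pct": 30, "weight_loss_pct": 50, "health_pct": 20},
}

def check_mapping(mapping: dict) -> None:
    """Fail loudly if any migration row does not distribute exactly 100%."""
    for mode, focus in mapping.items():
        total = sum(focus.values())
        assert total == 100, f"{mode}: focus percentages sum to {total}, expected 100"
```

Running `check_mapping` in the migration's test suite catches a mistyped percentage before it violates the `sum_equals_100` constraint at insert time.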

---

## 7. Implementation Phases

### Phase 1: Conception ✅ (THIS DOCUMENT)
**Duration:** -
**Goal:** Complete redesign concept

### Phase 2: Backend Redesign (6-8h)
- Create migration 023
- `focus_areas` table + CRUD
- Extend `goals` (weight, secondary values)
- Correct the data source mapping (lean_mass, VO2Max fix, etc.)
- Implement scoring system v2.0

### Phase 3: Frontend Redesign (8-10h)
- Focus areas slider UI
- Goal editor with guidance (reference values, norms)
- Weighting system in the goal list
- Compound goals UI (blood pressure, two values)
- Integrate the new goal types (tests)

### Phase 4: Testing & Refinement (2-3h)
- Test the migration (Phase 0a → v2.0)
- Verify the scoring logic
- UI/UX testing
- Edge cases (sum ≠ 100%, no goals, etc.)

**Total: 16-21h**

---

## 8. Open Questions / Decisions

### 8.1 Focus Areas vs. Goal Weights
**Question:** Do we need BOTH weighting systems?
- Focus areas (weight loss 30%, strength 25%, ...)
- Goal weights (goal "82 kg" = 30%, goal "VO2Max 46" = 25%, ...)

**Option A:** Focus areas ONLY
- Simpler
- Less redundancy
- But: less granular

**Option B:** BOTH systems
- Focus areas = strategic (direction)
- Goal weights = tactical (concrete priorities)
- More complex, but more flexible

**Recommendation:** Option B - the two systems complement each other

---

### 8.2 Concrete vs. Abstract Tests
**Question:** How concrete should strength goals be?

**Option A:** Very concrete
- `bench_press_1rm`, `squat_1rm`, `deadlift_1rm`
- Advantage: precise, measurable
- Disadvantage: many goal types

**Option B:** Abstract with context
- `strength` with a sub-type (bench/squat/deadlift)
- Advantage: more flexible
- Disadvantage: more complicated schema

**Recommendation:** Option A - concrete types, in exchange for clear measurability

---

### 8.3 Auto-Update of Current Values
**Question:** How often should current_value be refreshed?

**Option A:** On demand (when the goals page loads)
- Advantage: no background jobs
- Disadvantage: can lag behind

**Option B:** Trigger-based (on each new measurement)
- Advantage: always up to date
- Disadvantage: more complexity

**Recommendation:** Option A for the MVP, Option B later

---

## 9. Next Steps

### Gather user feedback:
1. ✅ Does the redesign solve all the problems raised?
2. ✅ Is the focus areas UI understandable?
3. ✅ Do the concrete goal types make sense?
4. ✅ Do we need both weighting systems?
5. ✅ Is anything still missing?

### After approval:
1. Write migration 023
2. Implement the backend
3. Implement the frontend
4. Testing

---

**Created:** March 26, 2026
**Status:** 📋 AWAITING FEEDBACK
**Next step:** User review & approval

---

**New file:** `docs/KONZEPT_ANALYSE_2026-03-26.md` (458 lines)

# Concept Analysis: Functional Concept vs. Gitea Issues

**Date:** March 26, 2026
**Analyst:** Claude Code
**Basis:** `.claude/docs/functional/mitai_jinkendo_konzept_diagramme_auswertungen_v2.md`
**Reviewed issues:** #26, #27, all open issues

---

## 1. Executive Summary

### Key Finding
The functional concept is **considerably more comprehensive** than the current Gitea issues #26 and #27. It defines a three-tier analysis system (descriptive → diagnostic → prescriptive) that goes far beyond simple charts and correlations.

### Strategic Recommendation
Do **NOT** implement issues #26 and #27 individually; instead:
1. **Restructure:** create concept-based phase issues
2. **Placeholders first:** implement the calculation placeholders first
3. **Then visualization:** charts consume the placeholders
4. **Then AI integration:** the AI uses rule-based scores + raw data

---
## 2. Analysis: Issue #26 vs. Functional Concept

### Issue #26: Extend Charts & Visualizations
**Status:** OPEN
**Priority:** Medium-High
**Effort:** 8-10h

**Defined charts:**
- Weight trends (line chart + trend line)
- Circumference history (multi-line)
- Vitals trends (RHR, HRV, BP)
- Sleep analysis (duration, phases)
- Nutrition charts (calories, macros)

### Functional Concept: Chart Catalog

**BODY (K1-K5):**
- K1: Weight trend + trend channel + goal projection
  - 7d rolling median, 28d/90d trend slope
  - Percentage progress toward the goal
  - Rule-based hints (too fast/too slow)
- K2: Body composition (weight/FM/LBM)
  - FM = weight × BF%, LBM = weight × (1 - BF%)
  - 28d/90d change in FM and LBM
- K3: Circumference panel (8 mini charts)
  - Left-right asymmetry
  - Waist/hip, waist/height
- K4: Recomposition detector (quadrants)
- K5: Body Progress Score (0-100)

**NUTRITION (E1-E5):**
- E1: Energy intake vs. expenditure vs. weight trend
- E2: Protein adequacy (g/day, g/kg, g/kg LBM)
- E3: Macro distribution + weekly consistency
- E4: Nutrition adherence score (0-100)
- E5: Energy availability warning

**ACTIVITY (A1-A8):**
- A1: Training volume per week
- A2: Intensity distribution / zone profile
- A3: Training quality matrix
- A4: Ability balance / ability radar
- A5: Load monitoring (internal load, monotony, strain)
- A6: Activity goal alignment score (0-100)
- A7: Rest-day / recovery compliance
- A8: VO2max development

### Assessment
❌ **Issue #26 is too narrowly scoped**
- Focuses only on basic visualization
- No scores, no baselines, no confidence
- No rule-based hints
- No goal dependence

✅ **The functional concept offers:**
- 18 dedicated charts (K1-K5, E1-E5, A1-A8)
- Scores as standalone visualizations
- Rule-based statements without AI
- Control via goal modes

---
## 3. Analysis: Issue #27 vs. Functional Concept

### Issue #27: Extend Correlations & Insights
**Status:** OPEN
**Priority:** High
**Effort:** 6-8h

**Defined correlations:**
- Sleep ↔ recovery (sleep duration → RHR, quality → HRV)
- Training ↔ vitals (load → RHR increase, HRV decrease)
- Nutrition ↔ performance (deficit → intensity)
- Blood pressure ↔ lifestyle (stress → BP, training → BP)
- Multi-factor analysis (AI insights)

### Functional Concept: Correlations (C1-C6)

**CORRELATIONS (C1-C6):**
- C1: Energy balance vs. weight change (lagged)
  - Lags: 0, 3, 7, 10, 14 days
  - Determine the best lag, effect size, confidence
- C2: Protein adequacy vs. LBM trend
  - 28d window comparison, training as a moderator
- C3: Training load vs. HRV/RHR (delayed by 1-3 days)
  - Dual lag evaluation, individual fatigue response
- C4: Sleep duration + sleep regularity vs. recovery
  - Bubble chart, Sleep Regularity Index
- C5: Blood pressure context matrix (context-dependent)
  - Measurement context, previous night's sleep, training
- C6: Plateau detector (event map)
  - Goal-specific plateau definitions

### In Addition: Lag Analysis Principles

**Mandatory in the functional concept:**
- **NEVER check only lag=0**
- Calorie balance → weight: 2-14 days delay
- Protein/strength training → LBM: 2-6 weeks delay
- Training load → HRV/RHR: 1-3 days delay
- Sleep deficit → recovery: 1-3 days delay

**Minimum data requirements:**
- Correlations: at least 21 paired daily values
- Lag-based: at least 28 paired days
- Confidence classes (high/medium/low/not evaluable)

### Assessment
❌ **Issue #27 is too superficial**
- No lag analysis
- No confidence assessment
- No minimum-data checks
- No goal dependence

✅ **The functional concept offers:**
- 6 dedicated correlation charts with lag analysis
- Explicit confidence assessment
- Medical safety mode
- Plateau detection (rule-based)

---

## 4. Conflict Analysis

### Are there contradictions between #26 and #27?
**NO** - they are complementary:
- #26: descriptive level (charts)
- #27: diagnostic level (correlations)

### But: both are too isolated
The functional concept shows: **charts and correlations must be interlocked**

**Example:**
```
Functional concept C1: energy balance vs. weight change
├─ Visualization: lag heatmap (diagnostic)
├─ Calculation: cross-correlation (0, 3, 7, 10, 14 day lags)
├─ Input data: daily calorie balance (E chart)
├─ Input data: 7d weight change (K chart)
└─ Rule-based statement: "For you, energy balance shows up in weight after ~7 days"
```

**Conclusion:** the charts (K, E, A) supply the base data for the correlations (C)

---

## 5. New Placeholders from the Functional Concept

### 5.1 BODY (18 new placeholders)

**Weight & trends:**
```python
{{weight_7d_rolling_median}}   # 7-day rolling median
{{weight_28d_trend_slope}}     # 28-day trend slope (kg/day)
{{weight_90d_trend_slope}}     # 90-day trend slope
{{weight_goal_progress_pct}}   # percentage progress toward the goal
{{weight_projection_days}}     # estimated days until target weight
{{weight_loss_rate_weekly}}    # kg/week (28d average)
```

**Body composition:**
```python
{{fm_current}}            # fat mass, current (kg)
{{lbm_current}}           # lean body mass, current (kg)
{{fm_28d_delta}}          # FM change over 28 days (kg)
{{lbm_28d_delta}}         # LBM change over 28 days (kg)
{{fm_90d_delta}}          # FM change over 90 days
{{lbm_90d_delta}}         # LBM change over 90 days
{{recomposition_score}}   # 0-100 (FM↓ + LBM↑ = ideal)
```
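
These composition placeholders derive directly from the K2 formulas (FM = weight × BF%, LBM = weight × (1 − BF%)); a minimal sketch (function names are illustrative):

```python
def body_composition(weight_kg: float, body_fat_pct: float) -> dict:
    """Split total weight into fat mass and lean body mass (K2 formulas)."""
    fm = weight_kg * body_fat_pct / 100
    return {"fm_kg": round(fm, 2), "lbm_kg": round(weight_kg - fm, 2)}

def composition_delta(current: dict, previous: dict) -> dict:
    """Window deltas for the {{fm_28d_delta}} / {{lbm_28d_delta}} placeholders."""
    return {
        "fm_delta": round(current["fm_kg"] - previous["fm_kg"], 2),
        "lbm_delta": round(current["lbm_kg"] - previous["lbm_kg"], 2),
    }
```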
|
||||||
|
|
||||||
|
**Umfänge:**
|
||||||
|
```python
|
||||||
|
{{waist_to_hip_ratio}} # Taille/Hüfte Verhältnis
|
||||||
|
{{waist_to_height_ratio}} # Taille/Körpergröße (Gesundheitsmarker)
|
||||||
|
{{arm_asymmetry_pct}} # Links-Rechts Differenz %
|
||||||
|
{{leg_asymmetry_pct}} # Oberschenkel L-R Differenz
|
||||||
|
{{waist_28d_delta}} # Taillenumfang Änderung 28d
|
||||||
|
```
|
||||||
|
|
||||||
|
**Body Progress Score:**
|
||||||
|
```python
|
||||||
|
{{body_progress_score}} # 0-100 (zielabhängig gewichtet)
|
||||||
|
```
|
||||||
|
|
||||||
|
### 5.2 ERNÄHRUNG (15 neue Platzhalter)
|
||||||
|
|
||||||
|
**Energie & Bilanz:**
|
||||||
|
```python
|
||||||
|
{{kcal_7d_avg}} # Bereits vorhanden? Prüfen
|
||||||
|
{{kcal_28d_avg}} # 28-Tage Durchschnitt
|
||||||
|
{{kcal_estimated_tdee}} # Geschätzter Gesamtumsatz
|
||||||
|
{{kcal_balance_7d_avg}} # Durchschnittliche Bilanz 7d
|
||||||
|
{{kcal_balance_28d_avg}} # Durchschnittliche Bilanz 28d
|
||||||
|
{{energy_availability_status}} # "adequate" | "low" | "critical"
|
||||||
|
```
|
||||||
|
|
||||||
|
**Protein:**
|
||||||
|
```python
|
||||||
|
{{protein_g_per_kg}} # Protein g/kg Körpergewicht
|
||||||
|
{{protein_g_per_kg_lbm}} # Protein g/kg Magermasse
|
||||||
|
{{protein_adequacy_score}} # 0-100 (Ziel: 1.6-2.2 g/kg)
|
||||||
|
```
|
||||||
|
|
||||||
|
**Makros & Adhärenz:**
|
||||||
|
```python
|
||||||
|
{{carb_pct_7d_avg}} # % der Gesamtkalorien
|
||||||
|
{{fat_pct_7d_avg}} # % der Gesamtkalorien
|
||||||
|
{{macro_consistency_score}} # 0-100 (Regelmäßigkeit)
|
||||||
|
{{nutrition_adherence_score}} # 0-100 (Gesamtscore)
|
||||||
|
{{nutrition_days_7d}} # Erfasste Tage letzte 7d
|
||||||
|
{{nutrition_days_28d}} # Erfasste Tage letzte 28d
|
||||||
|
```
|
||||||
|
|
||||||
|
### 5.3 AKTIVITÄT (25 neue Platzhalter)
|
||||||
|
|
||||||
|
**Volumen:**
|
||||||
|
```python
|
||||||
|
{{activity_volume_7d_min}} # Gesamtminuten 7 Tage
|
||||||
|
{{activity_volume_28d_min}} # Gesamtminuten 28 Tage
|
||||||
|
{{activity_frequency_7d}} # Anzahl Sessions 7d
|
||||||
|
{{activity_frequency_28d}} # Anzahl Sessions 28d
|
||||||
|
{{activity_avg_duration_28d}} # Durchschn. Dauer pro Session
|
||||||
|
```
|
||||||
|
|
||||||
|
**Intensität:**
|
||||||
|
```python
|
||||||
|
{{activity_z1_pct}} # % Zeit in Zone 1 (7d)
|
||||||
|
{{activity_z2_pct}} # % Zeit in Zone 2
|
||||||
|
{{activity_z3_pct}} # % Zeit in Zone 3
|
||||||
|
{{activity_z4_pct}} # % Zeit in Zone 4
|
||||||
|
{{activity_z5_pct}} # % Zeit in Zone 5
|
||||||
|
{{activity_polarization_index}} # Polarisierung (Z1+Z2 vs Z4+Z5)
|
||||||
|
```
|
||||||
|
|
||||||
|
**Qualität & Load:**
|
||||||
|
```python
|
||||||
|
{{activity_quality_avg_28d}} # Durchschn. Quality-Score
|
||||||
|
{{activity_load_7d}} # Interne Last (7d Summe)
|
||||||
|
{{activity_load_28d}} # Interne Last (28d Summe)
|
||||||
|
{{activity_monotony_28d}} # Last-Variabilität
|
||||||
|
{{activity_strain_28d}} # Load × Monotony
|
||||||
|
{{activity_acwr}} # Acute:Chronic Workload Ratio
|
||||||
|
```
|
||||||
|
|
||||||
|
**Fähigkeiten:**
|
||||||
|
```python
|
||||||
|
{{ability_strength_score}} # 0-100 (aus Training Types)
|
||||||
|
{{ability_endurance_score}} # 0-100
|
||||||
|
{{ability_mobility_score}} # 0-100
|
||||||
|
{{ability_skills_score}} # 0-100
|
||||||
|
{{ability_mindfulness_score}} # 0-100
|
||||||
|
{{ability_balance_score}} # 0-100 (wie ausgewogen?)
|
||||||
|
```
|
||||||
|
|
||||||
|
**Goal Alignment:**
|
||||||
|
```python
|
||||||
|
{{activity_goal_alignment_score}} # 0-100 (zielabhängig)
|
||||||
|
{{rest_days_compliance}} # 0-100 (geplant vs. tatsächlich)
|
||||||
|
```
|
||||||
|
|
||||||
|
### 5.4 RECOVERY & GESUNDHEIT (12 neue Platzhalter)
|
||||||
|
|
||||||
|
**Baselines:**
|
||||||
|
```python
|
||||||
|
{{rhr_7d_baseline}} # 7-Tage Baseline Ruhepuls
|
||||||
|
{{rhr_28d_baseline}} # 28-Tage Baseline
|
||||||
|
{{hrv_7d_baseline}} # 7-Tage Baseline HRV
|
||||||
|
{{hrv_28d_baseline}} # 28-Tage Baseline
|
||||||
|
```
|
||||||
|
|
||||||
|
**Deltas & Trends:**
|
||||||
|
```python
|
||||||
|
{{rhr_vs_baseline_7d}} # Abweichung von Baseline (bpm)
|
||||||
|
{{hrv_vs_baseline_7d}} # Abweichung von Baseline (ms)
|
||||||
|
{{vo2max_trend_28d}} # VO2max Entwicklung
|
||||||
|
```
|
||||||
|
|
||||||
|
**Scores:**
|
||||||
|
```python
|
||||||
|
{{recovery_score}} # 0-100 (HRV, RHR, Schlaf)
|
||||||
|
{{recovery_score_confidence}} # 0-100 (Datenqualität)
|
||||||
|
{{sleep_regularity_index}} # Schlafregelmäßigkeit
|
||||||
|
{{sleep_debt_hours}} # Akkumulierte Schlafschuld
|
||||||
|
{{health_risk_score}} # 0-100 (Blutdruck, etc.)
|
||||||
|
```

### 5.5 CORRELATIONS (8 new placeholders)

```python
{{corr_energy_weight_lag}}  # best lag energy→weight (days)
{{corr_energy_weight_r}}    # correlation coefficient
{{corr_protein_lbm_r}}      # protein ↔ LBM correlation
{{corr_load_hrv_lag}}       # best lag load→HRV
{{corr_load_hrv_r}}         # correlation
{{corr_sleep_rhr_r}}        # sleep ↔ RHR correlation
{{plateau_detected}}        # true|false (rule-based)
{{plateau_type}}            # "weight_loss" | "strength" | etc.
```

### 5.6 META PLACEHOLDERS (6 new)

```python
{{goal_mode}}                # "weight_loss" | "strength" | etc.
{{training_age_weeks}}       # training experience
{{data_quality_score}}       # 0-100 (overall data quality)
{{measurement_consistency}}  # 0-100 (measurement-time consistency)
{{analysis_confidence}}      # "high" | "medium" | "low"
{{analysis_timeframe}}       # "7d" | "28d" | "90d"
```
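All of these tokens share the same `{{name}}` syntax, so resolution can be a single substitution pass. A minimal sketch of how `placeholder_resolver.py` might substitute them (the function name and the leave-unknown-tokens-intact behavior are assumptions):

```python
import re

def resolve_placeholders(template: str, values: dict) -> str:
    """Replace {{name}} tokens with computed values; leave unknown tokens untouched.

    Assumption: illustrative sketch, not the actual placeholder_resolver.py API.
    """
    def substitute(match: re.Match) -> str:
        value = values.get(match.group(1))
        return str(value) if value is not None else match.group(0)

    return re.sub(r"\{\{(\w+)\}\}", substitute, template)
```

For example, `resolve_placeholders("Confidence: {{analysis_confidence}}", {"analysis_confidence": "high"})` yields `"Confidence: high"`, while an unresolved token is passed through unchanged so it remains visible during debugging.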

---
## 6. Overall summary: new placeholders

| Category | Count | Examples |
|----------|--------|-----------|
| BODY | 18 | weight_28d_trend_slope, fm_28d_delta, recomposition_score |
| NUTRITION | 15 | protein_g_per_kg_lbm, nutrition_adherence_score, energy_availability_status |
| ACTIVITY | 25 | activity_quality_avg_28d, activity_strain_28d, ability_balance_score |
| RECOVERY | 12 | recovery_score, sleep_regularity_index, sleep_debt_hours |
| CORRELATIONS | 8 | corr_energy_weight_lag, plateau_detected, corr_load_hrv_r |
| META | 6 | goal_mode, data_quality_score, analysis_confidence |
| **TOTAL** | **84** | **New placeholders from the Fachkonzept (functional spec)** |

---

## 7. Strategic roadmap recommendation

### Phase 0: Foundation (NOW)
**Goal:** Implement the calculation placeholders
**Effort:** 16-20h
**Deliverables:**
- 84 new placeholders in `placeholder_resolver.py`
- Baseline calculations (7d, 28d, 90d)
- Score algorithms (body progress, nutrition adherence, activity goal alignment, recovery)
- Lag-correlation functions
- Confidence calculation

**Issues to create:**
- #52: Baseline & Trend Calculations (body, nutrition, activity)
- #53: Score Algorithms (4 main scores)
- #54: Correlation & Lag Analysis
- #55: Confidence & Data Quality Metrics

### Phase 1: Visualization (THEN)
**Goal:** Charts use the new placeholders
**Effort:** 12-16h
**Deliverables:**
- K1-K5 charts (body)
- E1-E5 charts (nutrition)
- A1-A8 charts (activity)
- C1-C6 charts (correlations)

**Issues to consolidate:**
- Expand #26 into "Comprehensive Chart System (K, E, A, C)"
- Expand #27 into "Correlation & Lag Analysis Charts"

### Phase 2: Rule-based insights (AFTER THAT)
**Goal:** The system becomes a coach (not just a data collector)
**Effort:** 8-12h
**Deliverables:**
- Rule-based hints without AI
- Plateau detection
- Goal-dependent interpretations
- Warnings (health, overtraining, energy availability)

**New issues:**
- #56: Rule-Based Recommendations Engine
- #57: Goal-Mode System & Interpretation
- #58: Health & Safety Warnings

### Phase 3: AI integration (LATER)
**Goal:** The AI uses scores + raw data + rules
**Effort:** 6-8h
**Deliverables:**
- AI prompts use the new placeholders
- Contextual AI analysis (uses goal_mode)
- Multi-factor insights

---

## 8. Recommended actions

### IMMEDIATELY (today)
1. ✅ **Do NOT implement issues #26 and #27 individually**
2. ✅ **Create new issue #52:** Baseline & Trend Calculations
3. ✅ **Create new issue #53:** Score Algorithms
4. ✅ **Rename/expand issue #26:** "Comprehensive Chart System (based on Fachkonzept)"
5. ✅ **Rename/expand issue #27:** "Correlation & Lag Analysis (based on Fachkonzept)"

### THIS WEEK
6. ✅ **Start implementation:** Phase 0 - placeholders
7. ✅ **Documentation:** mapping from functional spec to code
8. ✅ **Prepare AI prompts:** use the new placeholders

### NEXT WEEK
9. ✅ **Implementation:** Phase 1 - charts
10. ✅ **Testing:** all scores & calculations
11. ✅ **Production:** prepare deployment

---

## 9. Summary: transformation from data collector to active coach

### Current state
**Data collector:**
- Data is captured
- Simple lists
- Basic statistics
- AI analyses triggered manually

### Target (per the functional spec)
**Active coach:**
- Data is **interpreted**
- Trends & baselines
- Scores & confidence
- Rule-based hints
- Goal-dependent evaluation
- Proactive warnings
- AI uses structured insights

---

## 10. Next steps

1. **Restructure issues** (today)
2. **Implement placeholders** (Phase 0, this week)
3. **Implement charts** (Phase 1, next week)
4. **Rule-based insights** (Phase 2, the week after)
5. **AI integration** (Phase 3, after that)

**Commit:** cd2609d
**Analyzed by:** Claude Code
**Basis:** Fachkonzept v2 (2086 lines, 24.03.2026)

1058 docs/MEMBERSHIP_SYSTEM.md (new file)
File diff suppressed because it is too large

460 docs/NEXT_STEPS_2026-03-26.md (new file)
@@ -0,0 +1,460 @@
# Next steps after Phase 0a

**As of:** March 26, 2026, after completion of Phase 0a (Goal System)
**Current branch:** `develop`
**Deployed:** `dev.mitai.jinkendo.de`

---

## Current status ✅

### Completed
- ✅ **Phase 0a:** minimal goal system (strategic + tactical)
  - Migration 022, goals.py router, GoalsPage UI
  - Navigation from Dashboard + Analysis
  - Mobile-friendly design
- **Foundation in place for 120+ goal-aware placeholders**

### Open Gitea issues
- 🔲 **#49:** Prompt assignment to history pages (6-8h)
- 🔲 **#47:** Value table optimization (4-6h)
- 🔲 **#46:** AI prompt creator (later)
- 🔲 **#45:** AI prompt optimizer (later)
- 🔲 **#43, #42:** Enhanced debug UI (later)

---

## Option A: Issue #49 - Prompt Page Assignment ⚡

**Effort:** 6-8 hours
**Priority:** Medium
**Type:** UX enhancement
**Labels:** feature, ux, enhancement

### Description
Make AI prompts flexibly available on different history pages. Each prompt can be offered on several pages at once (multi-select).

### Problem
**Currently:**
- Prompts are only available via the central analysis page
- No context-sensitive access to relevant analyses
- The user always has to navigate to the analysis page

**Example scenario:**
```
User is on: Weight → History
Wants: to analyze the weight trend
Must: go to the analysis page → pick a prompt → go back
```

**Desired:**
```
User is on: Weight → History
Sees: a "🤖 AI analysis" widget with relevant prompts
Can: start a "weight trend analysis" directly
```
### Technical implementation

**Backend (2h):**
```sql
-- Migration 023
ALTER TABLE ai_prompts ADD COLUMN available_on JSONB DEFAULT '["analysis"]';

-- Example:
{
  "slug": "weight_trend",
  "available_on": ["analysis", "weight_history"]
}
```

**API:**
```python
# New endpoint
GET /api/prompts/for-page/{page_slug}
→ Returns: List[Prompt] where available_on contains page_slug

# Extend CRUD
PUT /api/prompts/unified/{id}
→ Body: {..., "available_on": ["analysis", "weight_history"]}
```
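Independent of whether the containment check runs in SQL (JSONB) or in Python, the core logic of the new endpoint reduces to filtering on `available_on`. A plain-Python sketch, with the fallback to `["analysis"]` mirroring the column default above:

```python
def prompts_for_page(prompts: list[dict], page_slug: str) -> list[dict]:
    """Return the prompts whose available_on list contains the page slug.

    Prompts without the field fall back to ["analysis"], matching the
    JSONB column default from migration 023.
    """
    return [p for p in prompts if page_slug in p.get("available_on", ["analysis"])]
```

In the real endpoint this filter would more likely be pushed into the database query (e.g. JSONB containment); the sketch only shows the intended semantics.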

**Frontend (4h):**
```javascript
// Reusable component
<PagePrompts pageSlug="weight_history" />

// Extend UnifiedPromptModal
const PAGE_OPTIONS = [
  { value: 'analysis', label: '📊 Analyse (Hauptseite)', default: true },
  { value: 'weight_history', label: '⚖️ Gewicht → Verlauf' },
  { value: 'nutrition_history', label: '🍎 Ernährung → Verlauf' },
  // ... 9 options total
]

// Multi-select checkboxes in the prompt editor
```

**Integration into history pages (2h):**
- Extend WeightPage, NutritionPage, ActivityPage
- Insert the widget below the charts
- Modal for inline analysis

### Pros
- ✅ Quick payoff (UX improvement immediately visible)
- ✅ Uses the existing unified prompt system (issue #28)
- ✅ Relatively simple implementation
- ✅ Prepares for Phase 0b (new placeholders then immediately usable on all pages)

### Cons
- ⚠️ Delays strategic depth (goal-aware analyses)
- ⚠️ Only really useful once more prompts exist

**Documentation:** see `docs/issues/issue-51-prompt-page-assignment.md`

---

## Option B: Phase 0b - Goal-Aware Placeholders 🎯

**Effort:** 16-20 hours
**Priority:** High (strategically critical)
**Type:** Core feature
**Labels:** feature, ai, goal-system

### Description
Implement 120+ new AI placeholders that take `goal_mode` into account. Turns the system from a "data collector" into an "intelligent coach".

### Problem
**Currently:**
- Goals exist, but AI analyses ignore them
- The same data is interpreted identically for all goal_modes
- No goal-specific score calculations

**Example:**
```python
# Same measurement: -5 kg FM, -2 kg LBM
# Currently: generic score (e.g. 50/100)

# With Phase 0b:
goal_mode = "weight_loss"    → 78/100 (FM↓ good!)
goal_mode = "strength"       → 32/100 (LBM↓ a disaster!)
goal_mode = "recomposition"  → 65/100 (both relevant)
```

### Technical implementation

**1. Placeholder functions (8-10h):**

**Category: BODY (18 new):**
```python
def weight_7d_rolling_median(profile_id, goal_mode):
    """Rolling median instead of avg, for stability"""

def weight_28d_trend_slope(profile_id, goal_mode):
    """Linear regression slope - kg/week"""

def fm_28d_delta(profile_id, goal_mode):
    """Fat mass change over 28 days"""

def lbm_28d_delta(profile_id, goal_mode):
    """Lean body mass change over 28 days"""

def recomposition_score(profile_id, goal_mode):
    """FM↓ + LBM↑ balance score"""
    # Only relevant when goal_mode = "recomposition"

def waist_to_hip_ratio(profile_id):
    """WHR - abdominal fat distribution"""

def waist_to_height_ratio(profile_id):
    """WHtR - health risk"""
```
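`weight_28d_trend_slope` boils down to an ordinary least-squares fit over the daily series. A self-contained sketch; daily sampling and the kg/week scaling are assumptions matching the docstring above:

```python
def trend_slope_kg_per_week(daily_weights: list[float]) -> float:
    """Least-squares slope over daily weights, scaled from kg/day to kg/week."""
    n = len(daily_weights)
    mean_x = (n - 1) / 2                    # mean of day indices 0..n-1
    mean_y = sum(daily_weights) / n
    num = sum((i - mean_x) * (w - mean_y) for i, w in enumerate(daily_weights))
    den = sum((i - mean_x) ** 2 for i in range(n))
    return round(7 * num / den, 2) if den else 0.0
```

On 28 days of perfectly linear data losing 0.05 kg/day, this returns -0.35 kg/week.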

**Category: NUTRITION (15 new):**
```python
def protein_g_per_kg(profile_id, goal_mode):
    """Protein per kg body weight"""
    # Target depends on goal_mode:
    # strength: 2.0-2.2 g/kg
    # weight_loss: 1.8-2.0 g/kg
    # endurance: 1.4-1.6 g/kg

def protein_g_per_kg_lbm(profile_id):
    """Protein per kg lean body mass (more precise)"""

def nutrition_adherence_score(profile_id, goal_mode):
    """How well does the user hit their macro targets?"""
    # Targets depend on goal_mode

def energy_availability_status(profile_id):
    """kcal - activity_kcal - BMR = available energy"""
    # RED-S warning when < 30 kcal/kg LBM
```
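The RED-S check can be sketched with the commonly used definition of energy availability: (intake − exercise energy) per kg lean body mass, which is the definition usually paired with the 30 kcal/kg threshold. Note this differs slightly from the docstring above, which also subtracts BMR; treat the exact formula as an open design decision:

```python
def energy_availability(kcal_in: float, exercise_kcal: float, lbm_kg: float) -> float:
    """Energy availability in kcal per kg lean body mass per day.

    Assumption: uses the common (intake - exercise) / LBM definition,
    not the BMR-subtracting variant mentioned in the spec comment.
    """
    return round((kcal_in - exercise_kcal) / lbm_kg, 1)

def red_s_warning(ea_kcal_per_kg_lbm: float) -> bool:
    """Flag low energy availability (common threshold: 30 kcal/kg LBM)."""
    return ea_kcal_per_kg_lbm < 30.0
```

Example: 2000 kcal intake, 500 kcal exercise, 55 kg LBM gives 27.3 kcal/kg LBM, which is below the threshold and would be flagged.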

**Category: ACTIVITY (25 new):**
```python
def activity_quality_avg_28d(profile_id):
    """Average training quality"""

def activity_strain_28d(profile_id):
    """Cumulative strain (monotony detection)"""

def activity_monotony_28d(profile_id):
    """Variation in training (detect plateaus)"""

def ability_balance_score(profile_id, goal_mode):
    """Balance between abilities (strength/cardio/mobility)"""
    # Weighting depends on goal_mode
```

**Category: RECOVERY (12 new):**
```python
def recovery_score(profile_id):
    """
    Combines: RHR + HRV + sleep quality + rest days
    Score: 0-100
    """

def sleep_regularity_index(profile_id):
    """How regular are sleep times? (0-100)"""

def sleep_debt_hours(profile_id):
    """Accumulated sleep deficit vs. target"""
```

**Category: CORRELATIONS (8 new):**
```python
def corr_energy_weight_lag(profile_id):
    """
    Correlation calorie deficit → weight
    With lag analysis (delayed effects)
    Confidence score based on the amount of data
    """

def plateau_detected(profile_id):
    """
    Boolean: is weight stagnating despite a deficit?
    Trigger for interventions
    """
```

**Category: META (6 new):**
```python
def goal_mode(profile_id):
    """Current goal_mode (available to prompts)"""

def data_quality_score(profile_id):
    """How complete/consistent is the data? (0-100)"""

def profile_age_years(profile_id):
    """Age for age-dependent norms"""
```
**2. Score weighting (4-6h):**

```python
# backend/score_calculator.py (NEW)

SCORE_WEIGHTS = {
    "weight_loss": {
        "body_progress": 0.30,     # FM↓ important
        "nutrition": 0.25,         # deficit important
        "training_quality": 0.15,  # moderately important
        "recovery": 0.15,          # moderately important
        "adherence": 0.15          # consistency important
    },
    "strength": {
        "body_progress": 0.35,     # LBM↑ CRITICAL
        "nutrition": 0.30,         # surplus + protein
        "training_quality": 0.25,  # progressive overload
        "recovery": 0.10           # less important
    },
    "endurance": {
        "training_quality": 0.40,  # VO2max, pace important
        "recovery": 0.25,          # avoid overtraining
        "body_progress": 0.15,     # weight is secondary
        "nutrition": 0.20          # energy availability
    },
    # ... recomposition, health
}

def calculate_overall_score(profile_id, goal_mode):
    """Computes the overall score using the goal_mode weighting"""
    weights = SCORE_WEIGHTS[goal_mode]

    scores = {
        "body_progress": calculate_body_progress_score(profile_id, goal_mode),
        "nutrition": calculate_nutrition_score(profile_id, goal_mode),
        "training_quality": calculate_training_score(profile_id, goal_mode),
        "recovery": calculate_recovery_score(profile_id),
        "adherence": calculate_adherence_score(profile_id, goal_mode)
    }

    overall = sum(scores[key] * weights[key] for key in weights)
    return {
        "overall": round(overall, 1),
        "breakdown": scores,
        "weights": weights
    }
```
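The weighted sum itself is trivial; what matters is that each weight set sums to 1.0, so the overall score stays on the 0-100 scale. A worked example using the `strength` weights above with made-up sub-scores:

```python
# Strength weights from SCORE_WEIGHTS; sub-scores here are made-up example values.
weights = {"body_progress": 0.35, "nutrition": 0.30, "training_quality": 0.25, "recovery": 0.10}
scores = {"body_progress": 70, "nutrition": 80, "training_quality": 60, "recovery": 90}

assert abs(sum(weights.values()) - 1.0) < 1e-9  # sanity check: weights are normalized

# 70*0.35 + 80*0.30 + 60*0.25 + 90*0.10 = 24.5 + 24.0 + 15.0 + 9.0
overall = round(sum(scores[k] * weights[k] for k in weights), 1)
```

A unit test asserting that every entry in `SCORE_WEIGHTS` sums to 1.0 would catch weighting mistakes early.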

**3. Baseline calculations (2-3h):**

```python
def calculate_baselines(profile_id):
    """
    Computes personal reference values:
    - 7d baseline (short term)
    - 28d baseline (medium term)
    - 90d baseline (long term)

    For: weight, RHR, HRV, calories, protein, etc.
    """

def detect_anomalies(profile_id, metric, value):
    """
    Is the value outside ±2 SD of the baseline?
    → Warn the user
    """
```
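The ±2 SD rule can be implemented directly on the baseline window. This sketch uses population standard deviation and treats the history list as the baseline window, both of which are assumptions:

```python
from statistics import mean, pstdev

def is_anomaly(baseline_window: list[float], value: float, z: float = 2.0) -> bool:
    """True when the value deviates more than z standard deviations from the baseline.

    Assumption: population SD over the given window; a constant window never flags.
    """
    m = mean(baseline_window)
    sd = pstdev(baseline_window)
    return sd > 0 and abs(value - m) > z * sd
```

For a resting-heart-rate baseline hovering around 60 bpm with little variation, a reading of 62 bpm already exceeds 2 SD and would trigger a warning, while 60.5 bpm would not.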

**4. Integration into prompts (1-2h):**

```python
# Example prompt template:
"""
You are an AI coach for {{goal_mode}} training.

Current status:
- Weight trend: {{weight_28d_trend_slope}} kg/week
- Fat mass Δ28d: {{fm_28d_delta}} kg
- Lean mass Δ28d: {{lbm_28d_delta}} kg
- Recomposition score: {{recomposition_score}}/100

Nutrition:
- Protein/kg: {{protein_g_per_kg}} g/kg (target: {{protein_target_for_mode}})
- Adherence: {{nutrition_adherence_score}}/100

Training:
- Quality (28d): {{activity_quality_avg_28d}}/5.0
- Monotony: {{activity_monotony_28d}} (warning above 2.0)

Recovery:
- Recovery score: {{recovery_score}}/100
- Sleep debt: {{sleep_debt_hours}}h

Overall score ({{goal_mode}}-optimized): {{overall_score}}/100

Analyze the progress from the perspective of a {{goal_mode}} goal...
"""
```

### Pros
- ✅ Biggest strategic impact (the system becomes intelligent)
- ✅ Goals are actually used (not just displayed)
- ✅ Foundation for all future features
- ✅ Automatic training-phase detection becomes possible

### Cons
- ⚠️ High effort (16-20h)
- ⚠️ Complex logic (lots of testing needed)
- ⚠️ Requires more data for meaningful scores

---

## Option C: Issue #47 - Value Table Refinement 🔬

**Effort:** 4-6 hours
**Priority:** Low (polish)
**Type:** Enhancement

### Description
Make the value table clearer: normal mode shows single values only, expert mode includes raw stage data.

### Pros
- ✅ Better UX for the value table
- ✅ Less overwhelming in normal mode

### Cons
- ⚠️ Cosmetic, no functional impact
- ⚠️ Better to wait until Phase 0b (then 120+ placeholders)

**Recommendation:** later (after Phase 0b)

---

## Recommendation 🎯

### Scenario 1: "Quick wins first"
```
1. Issue #49 - Prompt Assignment (6-8h)
   → better UX immediately

2. Phase 0b - Goal-Aware Placeholders (16-20h)
   → new placeholders benefit from page assignment
   → full power with both features

Total: 22-28h
```

### Scenario 2: "Strategic depth first"
```
1. Phase 0b - Goal-Aware Placeholders (16-20h)
   → the system becomes intelligent

2. Issue #49 - Prompt Assignment (6-8h)
   → intelligent prompts then available on all pages

Total: 22-28h
```

### Personal recommendation: **Scenario 1**

**Rationale:**
- Issue #49 is relatively simple and delivers immediate UX value
- Makes optimal use of the existing unified prompt system
- Phase 0b then benefits from the better navigation
- Users can use the new placeholders (Phase 0b) directly on the relevant pages
- Psychologically: two wins instead of one big one

---

## Next session: action items

**If issue #49 is chosen:**
1. [ ] Create migration 023 (available_on JSONB)
2. [ ] Backend: `/api/prompts/for-page/{slug}` endpoint
3. [ ] Backend: extend CRUD (available_on in PUT)
4. [ ] Frontend: PAGE_OPTIONS in UnifiedPromptModal
5. [ ] Frontend: PagePrompts component (reusable)
6. [ ] Integration: WeightPage, NutritionPage, ActivityPage
7. [ ] Testing: multi-select, inline modal analysis

**If Phase 0b is chosen:**
1. [ ] Implement placeholder functions category by category (BODY → NUTRITION → ACTIVITY → RECOVERY → CORRELATIONS → META)
2. [ ] Define score weighting per goal_mode
3. [ ] Backend: create score_calculator.py
4. [ ] Implement baseline calculations
5. [ ] Integrate into existing prompts
6. [ ] Test with different goal_modes

---

## Metrics & timeline

**Estimated timeline (at 4h/day of development):**

| Scenario | Duration | Done by |
|----------|-------|------------|
| Issue #49 | 1.5-2 days | ~28.03.2026 |
| Phase 0b | 4-5 days | ~31.03.2026 |
| Scenario 1 (quick wins first) | 5.5-7 days | ~02.04.2026 |
| Scenario 2 (strategic first) | 5.5-7 days | ~02.04.2026 |

**At 8h/day of development:** the timeline is halved (~01.04.2026)

---

**Created:** March 26, 2026
**Status:** Active - awaiting decision
**Next update:** after completion of the chosen path

272 docs/STATUS_2026-03-27.md (new file)
@@ -0,0 +1,272 @@
# Project status: March 27, 2026

**Branch:** `develop`
**Latest version:** v0.9g+ (before release v0.9h)
**Deployment:** dev.mitai.jinkendo.de
**Next milestone:** release v0.9h → code splitting → Phase 0b

---

## 🎯 Current state: READY FOR RELEASE v0.9h

### What is done? ✅

#### Goals system (Phase 0a + Dynamic Focus Areas v2.0)
- ✅ **Migration 022:** goals, training_phases, fitness_tests tables
- ✅ **Migrations 027-032:** Dynamic Focus Areas
  - 26 base areas in 7 categories (user-extensible)
  - Many-to-many: goals ↔ focus areas with contribution weights
  - User preferences with dynamic weightings
- ✅ **Backend:**
  - `routers/goals.py` - CRUD for goals (~1200 lines, **needs splitting**)
  - `routers/focus_areas.py` - dynamic system CRUD (~350 lines)
- ✅ **Frontend:**
  - `GoalsPage.jsx` - strategic layer (~1180 lines, **needs component extraction**)
  - `CustomGoalsPage.jsx` - tactical daily entry
  - `AdminFocusAreasPage.jsx` - admin UI for focus areas
- ✅ **Navigation:** Dashboard + Analysis integrated

#### Bug fixes (all committed, deployment pending)
- ✅ Focus area contributions are saved (was missing from the API payload)
- ✅ Filtering: only weighted focus areas in the goal form
- ✅ Vitals baseline endpoint (parameter mismatch fixed)

---

## 📋 Gitea issues - status

### Closed ✅
- ✅ **#50:** Goals System v1 (Phase 0a)
- ✅ **#51:** Dynamic Focus Areas v2.0
- ✅ **#48:** Flexible AI prompt system
- ✅ **#44:** BUG - deleting analyses
- ✅ **#28:** AI prompt flexibilization
- ⏳ **#25:** Goals system (should be closed - it is done!)

### Open - prioritized 🔲
- 🔲 **#52:** NEW - blood pressure goals with dual targets (systolic/diastolic) - 2-3h
- 🔲 **#49:** Prompt assignment to history pages (6-8h, quick win)
- 🔲 **#47:** Value table optimization (4-6h, after Phase 0b)
- 🔲 **#30:** Responsive UI - desktop sidebar (8-10h)
- 🔲 **#29:** Abilities matrix UI (6-8h)

### Open - backlog 📦
- 📦 #46, #45: AI prompt creator/optimizer (later)
- 📦 #43, #42: Enhanced debug UI (later)
- 📦 #40: Logout button (cosmetic)
- 📦 #39: Usage badges on the dashboard (cosmetic)
- 📦 #27: Expand correlations (Phase 2)
- 📦 #26: Expand charts (Phase 1)

---

## 🚀 Next steps (user plan APPROVED)

### Phase 1: Testing + release (2-3 days)
```
Day 1-2: thorough tests of the goals module
[ ] Switch goal mode
[ ] Weight focus areas (test all 26)
[ ] Create goals with focus_contributions
[ ] Edit goals (change contributions)
[ ] Enter actual values (CustomGoalsPage)
[ ] Test the progress modal
[ ] Admin focus areas CRUD
[ ] Edge cases (empty data, extreme values)
[ ] Vitals baseline entry (resting heart rate) - after the new deployment

Day 3: deploy + release v0.9h
[ ] Final commit & push
[ ] Merge develop → main (PR in Gitea)
[ ] Tag v0.9h in Git
[ ] Deploy to production
[ ] Smoke tests
[ ] Write release notes
```

### Phase 2: Code splitting (1-2 days)
```
Day 3-4: backend router split
[ ] goals.py → 5 separate routers
    - goals.py (core CRUD, ~300 lines)
    - goal_types.py (~200 lines)
    - goal_progress.py (~150 lines)
    - training_phases.py (~150 lines)
    - fitness_tests.py (~150 lines)
[ ] Adjust imports
[ ] main.py: register the 5 new routers
[ ] Optional: review insights.py (if >800 lines)

Day 5: testing after the split
[ ] Test the API endpoints end to end
[ ] Frontend works
[ ] Deploy to dev
```

### Phase 3: Phase 0b - goal-aware placeholders (4 days)
```
Effort: 16-20h
New placeholders: 120+ functions

Day 6: BODY + NUTRITION (40 functions)
- weight_7d_rolling_median, weight_28d_trend_slope
- fm_28d_delta, lbm_28d_delta, recomposition_score
- protein_g_per_kg, protein_g_per_kg_lbm
- nutrition_adherence_score, energy_availability

Day 7: ACTIVITY + RECOVERY (37 functions)
- activity_quality_avg_28d, activity_strain_28d
- activity_monotony_28d, ability_balance_score
- recovery_score, sleep_regularity_index, sleep_debt_hours

Day 8: CORRELATIONS + META + scoring (20 functions + system)
- corr_energy_weight_lag, plateau_detected
- goal_mode, data_quality_score, profile_age_years
- Implement score weighting per goal_mode

Day 9: integration + testing
- Update prompts with the new placeholders
- Test with different goal_modes
- Documentation

Day 10: deploy v0.10a
```

---

## 📊 Code metrics (as of 27.03.2026)

### Large files (splitting candidates)
```
Backend:
- routers/goals.py        ~1200 lines ⚠️ SPLIT NEEDED
- routers/insights.py     ~800 lines (review)
- routers/focus_areas.py  ~350 lines ✓ OK

Frontend:
- pages/GoalsPage.jsx       ~1180 lines ⚠️ component extraction possible
- pages/AdminPanel.jsx      ~700 lines ✓ OK
- pages/CustomGoalsPage.jsx ~350 lines ✓ OK
```

### Migration status
```
Latest migration: 032_user_focus_area_weights.sql
Next: 033_dual_target_fields.sql (BP goals, issue #52)

All migrations 001-032 applied successfully on dev ✅
```

---

## 🔧 Technical debt

### High priority
1. **Code splitting:** goals.py is too large for the context window
2. **Component extraction:** make GoalsPage.jsx component-based
3. **Testing suite:** automated tests are missing entirely

### Medium priority
4. **Responsive UI:** desktop sidebar missing (issue #30)
5. **Error handling:** more defensive programming needed
6. **API documentation:** Swagger/OpenAPI missing

### Low priority
7. **Type hints:** more Python type annotations
8. **Performance:** optimize some N+1 queries
9. **Caching:** Redis for frequent queries

---

## 📚 Documentation - status

### Up to date ✅
- ✅ `CLAUDE.md` - main documentation
- ✅ `docs/STATUS_2026-03-27.md` - this status (NEW)
- ✅ `docs/NEXT_STEPS_2026-03-26.md` - roadmap for Phase 0b
- ✅ `docs/issues/issue-50-phase-0a-goal-system.md` - Phase 0a completed
- ✅ `docs/issues/issue-52-blood-pressure-dual-targets.md` - new issue (NEW)
- ✅ `.claude/docs/functional/AI_PROMPTS.md` - prompt system complete
- ✅ `.claude/docs/technical/MEMBERSHIP_SYSTEM.md` - feature enforcement

### To update 📝
- 📝 `CLAUDE.md` - add the v0.9g/h updates
- 📝 `.claude/docs/ROADMAP.md` - mark Phase 0a as ✅
- 📝 `.claude/library/` - update after the v0.9h release

---
|

## 🎯 Decision Points

### Decided ✅
1. **User plan APPROVED:** Testing → Release → Split → Phase 0b
2. **Code splitting:** backend routers first, frontend optional
3. **Phase 0b:** scenario 2 (strategic depth first) - 120+ placeholders
4. **Release strategy:** v0.9h as a stable rollback point

### Open 🤔
1. **Issue #52 (BP dual targets):** before or after Phase 0b? → **Recommendation: after Phase 0b**
2. **Frontend components:** extract during or after the split? → **Recommendation: after, time permitting**
3. **Issue #49 (prompt pages):** before or after Phase 0b? → **Recommendation: after Phase 0b**

---
|

## 🚨 Current Blockers / Risks

### No critical blockers ✅

**Minor risks:**
1. ⚠️ **Vitals baseline fix:** just deployed, needs testing
2. ⚠️ **Migration 032:** still has to run on prod (already applied on dev)
3. ⚠️ **Code splitting:** could introduce regressions → test thoroughly

---
|

## 📞 Entry Points for Resuming Work

**When you return to this state:**

1. **Read first:**
   - this document (STATUS_2026-03-27.md)
   - CLAUDE.md (current version)
   - docs/NEXT_STEPS_2026-03-26.md (roadmap)

2. **Check:**
   - Is v0.9h deployed? `git describe --tags`
   - Are dev/prod up? `curl https://dev.mitai.jinkendo.de/api/version`
   - Is the Gitea issue status current?

3. **Next step:**
   - If v0.9h is deployed: start code splitting
   - If not: run the testing checklist (see Phase 1 above)

4. **Claude Code context:**

```
"We are at the v0.9h release. The goals system is complete (Phase 0a + Dynamic Focus Areas v2.0).
Next step: [Testing/Code Splitting/Phase 0b] - see STATUS_2026-03-27.md"
```

---
|

## 📈 Metrics Since the Last Status

**Commits since v0.9g:**
- 6 commits (goals fixes, Focus Areas v2.0, vitals baseline fix)
- +1200 lines (new features)
- -400 lines (refactoring)

**Issues:**
- 3 closed (#50, #51, #48)
- 1 new (#52)
- 1 should be closed (#25)

**Deployment:**
- last 3 deployments successful
- dev environment stable
- prod on v0.9g (stable)

---

**Created:** March 27, 2026, 22:30
**By:** Claude Code (Sonnet 4.5)
**Next update:** after the v0.9h release

194
docs/STATUS_REPORT_2026-03-26.md
Normal file

@@ -0,0 +1,194 @@

# Status Report: March 26, 2026

## Audit & Synchronization

Full review of all documents and Gitea issues completed.

---

## ✅ Completed Work

### 1. Gitea Issue #28: AI Prompt Flexibilization
**Status:** ✅ CLOSED (26.03.2026)

**Implemented features:**
- Unified Prompt System (4 phases)
- DB migration to a unified schema (base + pipeline)
- universal executor (prompt_executor.py)
- frontend UI consolidation (UnifiedPromptModal)
- debug & development tools (test button, export/import)
- 32 active placeholders with categorization
- `{{placeholder|d}}` modifier
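
The modifier syntax can be illustrated with a minimal parser sketch. Everything here is illustrative, not the actual prompt_executor.py implementation, and the meaning of the `d` modifier (rendering the placeholder name alongside its value for debugging) is an assumption:

```python
import re

def render_prompt(template, values):
    """Replace {{name}} and {{name|modifier}} tokens in a prompt template."""
    def resolve(match):
        name, _, modifier = match.group(1).partition("|")
        value = values.get(name, "")
        if modifier == "d":
            # assumed "d" semantics: show name=value for debugging
            return f"{name}={value}"
        return str(value)
    return re.sub(r"\{\{([^}]+)\}\}", resolve, template)

print(render_prompt("Weight: {{weight}} / {{weight|d}}", {"weight": 82.4}))
# → Weight: 82.4 / weight=82.4
```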

**Commits:** 20+ commits (2e0838c through ae6bd0d)
**Documentation:** CLAUDE.md "Feature: Unified Prompt System"

**Gitea action:** issue closed with a completion comment

---

### 2. Gitea Issue #44: BUG - Deleting Analyses
**Status:** ✅ CLOSED (26.03.2026)

**Fix:**
- delete button added to InsightCard
- `api.deleteInsight(id)` function implemented
- auth token is passed correctly
- list refreshes after deletion

**Commit:** c56d2b2
**Documentation:** Gitea comment with a code example

**Gitea action:** issue closed with fix details

---

### 3. Feature: Comprehensive Value Table
**Status:** ✅ base implementation COMPLETE (26.03.2026)

**Implemented features:**
- metadata collection system (all placeholders with values)
- expert mode toggle (🔬 Experten-Modus)
- stage output extraction (single values from JSON)
- category grouping (PROFIL, KÖRPER, ERNÄHRUNG, etc.)
- collapsible JSON for raw stage data
- best-of-each circ_summary with age labels
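
Conceptually, the stage output extraction can be sketched as flattening a nested stage-output JSON into scalar rows for the table. A minimal illustration; the real extraction in the metadata collection system may differ:

```python
def extract_single_values(stage_output, prefix=""):
    """Flatten nested stage-output JSON into (path, value) rows,
    since the value table needs scalars rather than raw JSON blobs."""
    rows = []
    for key, value in stage_output.items():
        path = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            rows.extend(extract_single_values(value, path))
        elif not isinstance(value, list):  # skip lists: not single values
            rows.append((path, value))
    return rows

rows = extract_single_values({"body": {"weight": 82.4, "fm_pct": 21.0}, "notes": ["a"]})
# rows == [("body.weight", 82.4), ("body.fm_pct", 21.0)]
```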

**Commits:** 10+ commits (c0a50de through 6e651b5, 159fcab)
**Documentation:** CLAUDE.md "Feature: Comprehensive Value Table"

**Gitea:** base complete, Issue #47 created for refinement

---

### 4. Placeholder System Enhancements
**Status:** ✅ COMPLETE

**Fixes & improvements:**
- `circ_summary`: all 8 circumference points (instead of only 3)
- `circ_summary`: best-of-each with age labels ("heute", "vor 2 Wochen")
- `sleep_avg_quality`: lowercase stage names fix
- `calculate_age`: PostgreSQL DATE object handling
- stage outputs in the debug info for the value table
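
The age labels could come from a small helper along these lines. This is an illustrative sketch only; the label strings follow the German UI texts quoted above, and the exact thresholds are assumptions:

```python
from datetime import date

def age_label(measured, today):
    """Relative German age label in the "heute" / "vor 2 Wochen" style
    used by circ_summary (thresholds are illustrative)."""
    days = (today - measured).days
    if days == 0:
        return "heute"
    if days < 7:
        return f"vor {days} Tagen"
    weeks = days // 7
    return "vor 1 Woche" if weeks == 1 else f"vor {weeks} Wochen"

print(age_label(date(2026, 3, 13), date(2026, 3, 27)))  # → vor 2 Wochen
```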

**Commits:** 7daa2e4, a43a9f1, 3ad1a19, d06d3d8, 159fcab, 6e651b5

---

## 🔲 New/Open Issues

### Gitea Issue #47: Value Table Optimization
**Status:** 🔲 OPEN (newly created 26.03.2026)
**Priority:** Medium
**Effort:** 4-6 hours

**Goal:** make the value table easier to scan

**Key points:**
- normal mode: single values only (~24 instead of 32)
- expert mode: raw stage data in addition
- complete the descriptions for all 32 placeholders
- schema-based descriptions for extracted values

**Documentation:** `docs/issues/issue-50-value-table-refinement.md`

---

## 📊 Gitea Issue Overview

### Closed (today)
- ✅ #28: AI prompt flexibilization
- ✅ #44: BUG - deleting analyses

### Newly created (today)
- 🆕 #47: value table optimization

### Still open (backlog)
- 🔲 #25: goals system
- 🔲 #26: extend charts
- 🔲 #27: correlations & insights
- 🔲 #29: abilities matrix UI
- 🔲 #30: responsive UI
- 🔲 #42, #43: enhanced debug UI
- 🔲 #45: AI prompt optimizer
- 🔲 #46: AI prompt creator

### Closed earlier
- ✅ #24: quality filter for AI analyses

---

## 📝 Documentation Updates

### CLAUDE.md
- ✅ "Latest updates (26.03.2026)" section added
- ✅ Gitea issue references clarified (prefix "Gitea #")
- ✅ feature sections renamed (not "Issue #28/47")
- ✅ "Claude Code responsibilities" section
- ✅ issue management via the Gitea API documented

### docs/issues/
- ✅ issue-50-value-table-refinement.md created
- ℹ️ further files in .claude/issues/ (not versioned)

### Gitea comments
- ✅ Issue #28: completion details with features & commits
- ✅ Issue #44: fix details with a code example

---

## 🔄 Next Steps

### Recommended (short term)
1. **Testing on dev.mitai.jinkendo.de:**
   - test the value table in expert mode
   - check the stage outputs JSON display
   - verify circ_summary with age labels

2. **Production deployment:**
   - merge develop → main (once tests pass)
   - deploy all features (unified prompts + value table)

3. **Issue #47 refinement:**
   - optimize the value table in normal mode
   - complete the descriptions

### Optional (mid term)
4. **Prioritize the remaining open issues:**
   - #25: goals system (Phase 1)
   - #27: correlations (Phase 2)
   - #30: responsive UI (Phase 0)

---

## 📈 Metrics

**Commits (today):** 12
**Issues closed:** 2 (#28, #44)
**Issues created:** 1 (#47)
**Documentation updates:** 3 (CLAUDE.md, STATUS_REPORT, issue-50)
**Gitea comments:** 2

**Development time (estimated):** ~6-8 hours
- circ_summary enhancement: 1h
- stage outputs fix: 1h
- value table collapsible JSON: 1h
- issue management system: 1h
- documentation & sync: 2-4h

---

## ✅ Verification

- [x] all Gitea issues reviewed (47 issues total)
- [x] completed work identified (#28, #44)
- [x] issues closed in Gitea
- [x] completion comments added
- [x] CLAUDE.md updated
- [x] status report created
- [x] development documentation up to date

**Audit performed by:** Claude Code
**Date:** March 26, 2026, 14:55
**Branch:** develop
**Last commit:** 582f125

284
docs/TODO_GOAL_SYSTEM.md
Normal file

@@ -0,0 +1,284 @@

# Goal System - TODO & Open Items

**Created:** March 27, 2026
**Status:** active
**Purpose:** central tracking list for goal system development

---

## ✅ Done (27.03.2026)

### Phase 0a: Minimal Goal System (26.03.2026)
- ✅ Migration 022 (goal_mode, goals, training_phases, fitness_tests)
- ✅ backend router goals.py (490 lines)
- ✅ frontend GoalsPage (570 lines)
- ✅ navigation integration (Dashboard + Analysis)

### Phase 1: Quick Fixes (27.03.2026)
- ✅ goal_utils.py abstraction layer
- ✅ primary goal toggle fix
- ✅ lean mass calculation
- ✅ VO2Max column name fix

---

## 🔲 Next Steps (Priority)

### Phase 1.5: Flexible Goal System - DB Registry ✅ COMPLETE (27.03.2026)

**Status:** ✅ DONE
**Priority:** CRITICAL (blocks Phase 0b)
**Effort:** 8h (planned 8-12h)
**Decision:** 27.03.2026 - option B chosen

**Problem:**
- current system: hardcoded goal types (only 8 types possible)
- every new goal needs a code change + deploy
- future goals (meditation, rituals, plan deviation) not possible

**Solution: DB registry**
- goal types defined in the database
- admin UI: create new goals without code
- universal value fetcher (configurable)
- users can define their own custom metrics

**Tasks:**
- ✅ Migration 024: goal_type_definitions table
- ✅ backend: universal value fetcher (_fetch_latest, _fetch_avg, _fetch_count)
- ✅ backend: CRUD API for goal type definitions
- ✅ frontend: dynamic goal types dropdown
- ✅ admin UI: goal type management page
- ✅ seed data: 8 existing types migrated
- 🔲 testing: create all goals + a custom goal (NEXT)

**Why NOW (before Phase 0b)?**
- Phase 0b placeholders use goals for score calculations
- flexible goals → automatically available in placeholders
- rebuilding later = adapting 120+ placeholders (duplicate work)

**Documentation:** see "Flexible Goal System - Technical Details" below

---

### Phase 0b: Goal-Aware Placeholders (AFTER 1.5 - 16-20h)

**Status:** 🔲 READY TO START (Phase 1.5 ✅)
**Priority:** HIGH (strategically critical)
**Effort:** 16-20h
**Blocks:** intelligent AI analyses

**Tasks:**
- [ ] 18 KÖRPER placeholders (weight_7d_rolling_median, fm_28d_delta, lbm_28d_delta, recomposition_score, etc.)
- [ ] 15 ERNÄHRUNG placeholders (protein_g_per_kg, nutrition_adherence_score, energy_availability_status, etc.)
- [ ] 25 AKTIVITÄT placeholders (activity_quality_avg_28d, activity_strain_28d, ability_balance_score, etc.)
- [ ] 12 RECOVERY placeholders (recovery_score, sleep_regularity_index, sleep_debt_hours, etc.)
- [ ] 8 KORRELATIONEN placeholders (corr_energy_weight_lag, plateau_detected, etc.)
- [ ] 6 META placeholders (goal_mode, data_quality_score, profile_age_years, etc.)
- [ ] score weighting per goal_mode (SCORE_WEIGHTS dictionary)
- [ ] baseline calculations (7d/28d/90d reference values)
- [ ] integration into the existing prompts
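
The per-mode score weighting reduces to a lookup table plus a weighted sum. A minimal sketch; the weight values and sub-score names here are placeholders, not the planned Phase 0b configuration:

```python
# Per-goal_mode weighting of the sub-scores feeding an overall score.
# Weights are illustrative; each row sums to 1.0.
SCORE_WEIGHTS = {
    "weight_loss": {"body": 0.4, "nutrition": 0.3, "activity": 0.2, "recovery": 0.1},
    "strength":    {"body": 0.3, "nutrition": 0.2, "activity": 0.4, "recovery": 0.1},
    "health":      {"body": 0.25, "nutrition": 0.25, "activity": 0.25, "recovery": 0.25},
}

def overall_score(sub_scores, goal_mode):
    weights = SCORE_WEIGHTS[goal_mode]
    return round(sum(sub_scores[k] * w for k, w in weights.items()), 1)

print(overall_score({"body": 80, "nutrition": 60, "activity": 70, "recovery": 90}, "health"))
# → 75.0
```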

**Benefits:**
- the system becomes "intelligent" (no longer just a data collector)
- goals are actually used
- basis for automatic training phase detection

**Documentation:** `docs/NEXT_STEPS_2026-03-26.md` (lines 116-300)

---

### v2.0 Redesign (LATER - 8-10h)

**Status:** 📋 CONCEPT
**Priority:** MEDIUM (after Phase 0b & user feedback)
**Effort:** 8-10h (thanks to the abstraction layer)

**Problems to solve:**
1. ❌ primary goal too simplistic (only 1 allowed)
2. ❌ goal mode too simple (only 1 mode selectable)
3. ✅ missing current values (DONE in Phase 1)
4. ❌ abstract goal types (strength, flexibility)
5. ❌ blood pressure needs 2 values (systolic/diastolic)
6. ❌ no guidance for users (reference values missing)

**Solution:**
- Migration 023: focus_areas table with a weighting system
- UI: sliders for 6 focus areas (sum = 100%)
- backend: `get_focus_weights()` V2 implementation (one function!)
- compound goals for BP
- concrete test-based goals (Cooper, plank, etc.)
- reference values & norms in the UI
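
A compound BP goal has to treat both values together: it only counts as met when every sub-target is met. A minimal sketch of that evaluation logic; field names and target values here are hypothetical:

```python
def compound_goal_met(current, targets):
    """A compound goal is met only when ALL sub-targets are met,
    e.g. blood pressure needs systolic AND diastolic at or below target."""
    return all(current[name] <= target for name, target in targets.items())

bp_targets = {"systolic": 130, "diastolic": 85}  # hypothetical target values
print(compound_goal_met({"systolic": 128, "diastolic": 84}, bp_targets))  # → True
print(compound_goal_met({"systolic": 128, "diastolic": 92}, bp_targets))  # → False
```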

**Documentation:** `docs/GOAL_SYSTEM_REDESIGN_v2.md`

**Decision:** ⏳ waiting for user feedback after Phase 0b

---

## 🔗 Related Issues

### Gitea (http://192.168.2.144:3000/Lars/mitai-jinkendo/issues)
- **#49:** prompt assignment to history pages (6-8h, quick win)
- **#47:** value table optimization (4-6h, polishing)
- **#50:** Phase 0a goal system (✅ CLOSED)

### Internal docs
- `docs/issues/issue-50-phase-0a-goal-system.md` (✅ completed)
- `docs/issues/issue-51-prompt-page-assignment.md` (#49 spec)

---

## 📊 Roadmap Overview

| Phase | What | Status | Effort |
|-------|------|--------|--------|
| **Phase 0a** | Minimal Goal System | ✅ DONE | 3-4h |
| **Phase 1** | Quick Fixes + Abstraction | ✅ DONE | 4-6h |
| **Phase 1.5** | 🆕 **Flexible Goal System (DB Registry)** | ✅ **DONE** | 8h |
| **Phase 0b** | Goal-Aware Placeholders | 🔲 READY | 16-20h |
| **Issue #49** | Prompt Page Assignment | 🔲 OPEN | 6-8h |
| **v2.0** | Redesign (Focus Areas) | 📋 LATER | 8-10h |

**Total roadmap:** ~45-60h to a fully intelligent goal system

**CRITICAL:** Phase 1.5 MUST be finished before Phase 0b, otherwise duplicate work!

---

## 💡 Important Notes

### Abstraction Layer (no duplicate work!)
**File:** `backend/goal_utils.py`

```python
get_focus_weights(conn, profile_id)
```

- **V1 (now):** maps goal_mode → weights
- **V2 (v2.0):** reads the focus_areas table
- **Benefit:** the 120+ Phase 0b placeholders do NOT have to be rewritten

### Testing Checklist (after every deploy)
- [ ] change goal mode → weighting correct?
- [ ] set a primary goal → others set to false?
- [ ] lean mass goal → current value calculated?
- [ ] VO2Max goal → no server error?
- [ ] multiple goals → progress correct?

---

## 📅 Timeline

| Date | Event |
|-------|-------|
| 26.03.2026 | Phase 0a complete |
| 27.03.2026 | Phase 1 complete (quick fixes) |
| 28.03.2026 | **Phase 0b start (planned)** |
| 02.04.2026 | Phase 0b complete (estimated at 4h/day) |
| 04.04.2026 | v2.0 redesign (if validated) |

---

## 🔧 Flexible Goal System - Technical Details

### Architecture: DB Registry Pattern

**Before (Phase 0a/1):**
```javascript
// Frontend: hardcoded
const GOAL_TYPES = {
  weight: { label: 'Gewicht', unit: 'kg', icon: '⚖️' }
}
```

```python
# Backend: hardcoded if/elif
if goal_type == 'weight':
    cur.execute("SELECT weight FROM weight_log...")
elif goal_type == 'body_fat':
    cur.execute("SELECT body_fat_pct FROM caliper_log...")
```

**After (Phase 1.5):**
```sql
-- Database: configurable goal types
CREATE TABLE goal_type_definitions (
    type_key VARCHAR(50) UNIQUE,
    label_de VARCHAR(100),
    unit VARCHAR(20),
    icon VARCHAR(10),
    category VARCHAR(50),
    source_table VARCHAR(50),
    source_column VARCHAR(50),
    aggregation_method VARCHAR(20),  -- latest, avg_7d, count_7d, etc.
    calculation_formula TEXT,        -- JSON for complex calculations
    is_system BOOLEAN                -- system types cannot be deleted
);
```

```python
# Backend: universal fetcher
def get_current_value_for_goal(conn, profile_id, goal_type):
    """Reads the config from the DB, then runs the query"""
    config = get_goal_type_config(conn, goal_type)

    if config['calculation_formula']:
        return execute_formula(conn, profile_id, config['calculation_formula'])
    else:
        return fetch_by_method(
            conn, profile_id,
            config['source_table'],
            config['source_column'],
            config['aggregation_method']
        )
```

```javascript
// Frontend: dynamic
const goalTypes = await api.getGoalTypeDefinitions()
// loads the currently available types from the API
```

### Benefits

**Flexibility:**
- ✅ new goals via the admin UI (NO code deploy)
- ✅ users can define custom metrics
- ✅ future modules integrate automatically

**Examples of new goals:**
- 🧘 meditation (min/day) → `meditation_log.duration_minutes`, avg_7d
- 📅 training frequency (x/week) → `activity_log.id`, count_7d
- 📊 plan deviation (%) → `activity_log.planned_vs_actual`, avg_30d
- 🎯 ritual adherence (%) → `rituals_log.completed`, avg_30d
- 💤 sleep quality (%) → `sleep_log.quality_score`, avg_7d

**Integration with Phase 0b:**
- placeholders use `get_current_value_for_goal()` → all types automatically available
- new goals → immediately usable in AI analyses
- no placeholder changes needed

---

**Last updated:** March 27, 2026 (Phase 1.5 ✅ COMPLETE)
**Next update:** after Phase 0b completion

---

## 🎉 Phase 1.5 Completion Report (27.03.2026)

**Commits:**
- `65ee5f8` - Phase 1.5 part 1/2 (backend, migration, universal fetcher)
- `640ef81` - Phase 1.5 part 2/2 (dynamic frontend, admin UI) - **COMPLETE**

**Implemented:**
1. ✅ DB registry for goal types (8 system types seeded)
2. ✅ universal value fetcher (8 aggregation methods)
3. ✅ CRUD API (admin-only, system types protected)
4. ✅ dynamic frontend (no more hardcoded types)
5. ✅ admin UI (complete CRUD interface)

**The system is now flexible:**
- new goal types via the UI without a code deploy
- Phase 0b placeholders automatically use all types
- custom metrics possible (meditation, rituals, etc.)

**Ready for Phase 0b:** 120+ goal-aware placeholders 🚀

245
docs/issues/issue-50-phase-0a-goal-system.md
Normal file

@@ -0,0 +1,245 @@

# Phase 0a: Minimal Goal System (Strategic + Tactical)

**Status:** ✅ COMPLETE (26.03.2026)
**Labels:** feature, enhancement, goal-system
**Priority:** High (foundation for Phase 0b)
**Effort:** 3-4h (estimated) / ~4h (actual)

---

## Description

Implementation of the minimal goal system as the basis for goal-aware AI analyses. Two-layer architecture:
- **Strategic layer:** goal modes (influence the score weighting)
- **Tactical layer:** concrete target values with progress tracking

---

## Implemented ✅

### Strategic Layer (Goal Modes)
- `goal_mode` in the `profiles` table
- 5 modes: `weight_loss`, `strength`, `endurance`, `recomposition`, `health`
- determines the score weighting for all AI analyses
- **UI:** 5 goal mode cards with descriptions and icons

### Tactical Layer (Concrete Goals)
- `goals` table with full tracking:
  - target/current/start values
  - progress percentage (auto-calculated)
  - projection date & on-track status
  - primary/secondary goal concept
- 8 goal types: weight, body_fat, lean_mass, vo2max, strength, flexibility, bp, rhr
- **UI:**
  - goal CRUD with progress bars
  - mobile-friendly design (full-width inputs, labels above fields)
  - inline editing prepared

### Training Phases Framework
- `training_phases` table (auto-detection prepared for Phase 2)
- 5 phase types: calorie_deficit, calorie_surplus, deload, maintenance, periodization
- status flow: suggested → accepted → active → completed → rejected
- confidence scoring for AI-based detection
- JSONB detection_params for flexibility

### Fitness Tests
- `fitness_tests` table for standardized tests
- 8 test types: cooper_12min, step_test, pushups_max, plank_max, flexibility_sit_reach, vo2max_est, strength_1rm_squat, strength_1rm_bench
- norm categorization prepared (age/gender-specific)
- baseline tracking for progress measurement

---

## Technical Implementation

### Backend

**Migration 022:** `backend/migrations/022_goal_system.sql`
```sql
-- Strategic layer
ALTER TABLE profiles ADD COLUMN goal_mode VARCHAR(50) DEFAULT 'health';

-- Tactical layer
CREATE TABLE goals (...);
CREATE TABLE training_phases (...);
CREATE TABLE fitness_tests (...);
```

**Router:** `backend/routers/goals.py` (490 lines)
- full CRUD for all 3 layers
- progress calculation (auto-updates current values)
- linear projection for target_date
- helper functions for goal-type-specific current values
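
Progress percentage and linear projection reduce to simple arithmetic. A sketch of both calculations, simplified and without the DB plumbing that goals.py would add (function names and clamping behavior are illustrative):

```python
from datetime import date, timedelta

def progress_pct(start, current, target):
    """Percentage of the way from start to target, clamped to 0-100."""
    if target == start:
        return 100.0
    pct = (current - start) / (target - start) * 100
    return max(0.0, min(100.0, round(pct, 1)))

def projected_date(start_date, today, start, current, target):
    """Linear projection: extrapolate the rate so far to the target value."""
    elapsed = (today - start_date).days
    if elapsed == 0 or current == start:
        return None  # no measurable rate yet
    rate_per_day = (current - start) / elapsed
    remaining_days = (target - current) / rate_per_day
    return today + timedelta(days=round(remaining_days))

print(progress_pct(90.0, 85.0, 80.0))  # → 50.0
```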

**API endpoints:** `/api/goals/*`
- `GET/PUT /mode` - strategic goal mode
- `GET /list` - all goals with progress
- `POST /create` - create a goal
- `PUT /{id}` - update a goal
- `DELETE /{id}` - delete a goal
- `GET/POST /phases` - training phases
- `PUT /phases/{id}/status` - accept/reject auto-detected phases
- `GET/POST /tests` - fitness tests

### Frontend

**GoalsPage:** `frontend/src/pages/GoalsPage.jsx` (570 lines)
- **Goal mode selector:** 5 cards with icons, colors, descriptions
- **Goal list:** cards with progress bars, projection display, edit/delete
- **Goal form:** mobile-optimized modal
  - full-width inputs
  - labels above fields (not beside them)
  - section headers with emoji (🎯 Zielwert)
  - unit displayed as a styled badge
  - primary goal checkbox in a highlighted section
  - text-align: left for text fields, right for numbers
- **Empty state:** placeholder with a CTA

**Navigation integration:**
- **Dashboard:** goals preview card with a "Verwalten →" link
- **Analysis page:** 🎯 Ziele button next to the title (direct access)
- **Route:** `/goals` registered in App.jsx

**api.js:** 15+ new API functions
```javascript
// Goal modes
getGoalMode(), updateGoalMode(mode)

// Goals CRUD
listGoals(), createGoal(data), updateGoal(id, data), deleteGoal(id)

// Training phases
listTrainingPhases(), createTrainingPhase(data), updatePhaseStatus(id, status)

// Fitness tests
listFitnessTests(), createFitnessTest(data)
```

---

## Commits

| Commit | Description |
|--------|-------------|
| `337667f` | feat: Phase 0a - Minimal Goal System (Strategic + Tactical) |
| `906a3b7` | fix: Migration 022 - remove invalid schema_migrations tracking |
| `75f0a5d` | refactor: mobile-friendly goal form design |
| `5be52bc` | feat: goals navigation + UX improvements |

**Branch:** `develop`
**Deployed to:** `dev.mitai.jinkendo.de` ✅

---

## Documentation

- ✅ `docs/GOALS_SYSTEM_UNIFIED_ANALYSIS.md` (538 lines)
  - analysis of both functional concepts (Konzept v2 + GOALS_VITALS.md)
  - two-layer architecture explained
  - 120+ placeholder categorization for Phase 0b
- ✅ Migration 022 with complete COMMENT ON statements
- ✅ API documentation in router docstrings
- ✅ this issue document

---

## Basis for Phase 0b

Phase 0a provides the foundation for:

### Phase 0b: Goal-Aware Placeholders (16-20h)
- ✅ 120+ new placeholders that take `goal_mode` into account
- ✅ score calculations depending on the strategic layer
- ✅ baseline calculations (7d/28d/90d trends)
- ✅ lag-based correlations
- ✅ confidence scoring

**Example of goal-mode impact:**

```python
# Same data, different interpretation:
# Δ: -5kg FM, -2kg LBM

goal_mode = "weight_loss"
# → body_progress_score = 78/100 (FM↓ good, LBM↓ tolerable)

goal_mode = "strength"
# → body_progress_score = 32/100 (LBM↓ is a DISASTER!)

goal_mode = "health"
# → body_progress_score = 50/100 (neutral, no bias)
```
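
As a toy illustration of how such a mode-dependent score could be computed: the coefficients and formula below are invented for illustration and deliberately produce only roughly similar numbers, not the planned Phase 0b scoring.

```python
# Score a body-composition change depending on goal_mode: losing fat mass
# (negative fm delta) is rewarded, losing lean body mass is penalized more
# or less strongly per mode. All coefficients are illustrative.
PENALTY_LBM_LOSS = {"weight_loss": 4, "strength": 20, "health": 10}

def body_progress_score(fm_delta_kg, lbm_delta_kg, goal_mode):
    score = 50 - 5 * fm_delta_kg        # fat loss adds points
    if lbm_delta_kg < 0:                # lean mass loss: mode-weighted penalty
        score += PENALTY_LBM_LOSS[goal_mode] * lbm_delta_kg
    return max(0, min(100, round(score)))

for mode in ("weight_loss", "strength", "health"):
    print(mode, body_progress_score(-5, -2, mode))
```

The ordering matches the example above: the same -5 kg FM / -2 kg LBM change scores highest in weight_loss mode and lowest in strength mode.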

---

## Testing

✅ migration successful on dev.mitai.jinkendo.de
✅ goal mode switchable
✅ goal CRUD works
✅ progress calculation correct
✅ mobile UI responsive
✅ navigation from Dashboard + Analysis

**Manual tests performed:**
- [x] change goal mode
- [x] create a goal (all 8 types)
- [x] edit a goal
- [x] delete a goal
- [x] set a primary goal
- [x] progress bars correct
- [x] mobile UI full-width
- [x] text-align correct

---

## Acceptance Criteria

- [x] Migration 022 successful
- [x] goal mode in profiles works
- [x] goals CRUD complete
- [x] progress tracking works
- [x] primary goal concept implemented
- [x] mobile-friendly UI
- [x] navigation from 2+ places
- [x] API documentation complete
- [x] frontend form validation
- [x] error handling correct

---

## Next Steps

**Recommended:**

1. **Option A: Issue #49 - Prompt Page Assignment (6-8h)**
   - assign prompts to history pages
   - quick win for better UX
   - uses the existing Unified Prompt System

2. **Option B: Phase 0b - Goal-Aware Placeholders (16-20h)**
   - 120+ new placeholders
   - score calculations with goal_mode
   - biggest strategic impact

**See:** `docs/NEXT_STEPS_2026-03-26.md` for detailed planning

---

## Lessons Learned

### What went well
- ✅ the two-layer architecture (strategic + tactical) makes sense
- ✅ mobile-first design from the start
- ✅ unified analysis before implementation (both functional concepts)
- ✅ the migration system works flawlessly

### Things to watch out for
- ⚠️ schema_migrations uses `filename`, not `version`
- ⚠️ avoid unnecessary DO blocks in migrations
- ⚠️ text-align: right is the default in form-input (override it for text fields)

---

**Created:** March 26, 2026
**Status:** ✅ COMPLETE - Ready for Phase 0b
**Related issues:** #49 (Prompt Assignment), #47 (Value Table Refinement)