diff --git a/.env b/.env
index b684859..6f5a9d4 100644
--- a/.env
+++ b/.env
@@ -46,6 +46,11 @@ TEST_APP_DB_DSN=postgresql://local-Python:Neo-local-1991125@100.64.0.4:5432/test
 TIMEZONE=Asia/Shanghai
 LOG_LEVEL=INFO
 
+# ------------------------------------------------------------------------------
+# Business-day cutoff (hour that splits daily/weekly/monthly stats; default 8, i.e. 08:00)
+# ------------------------------------------------------------------------------
+BUSINESS_DAY_START_HOUR=8
+
 # ==============================================================================
 # Unified output path configuration (export/ directory)
 # ==============================================================================
@@ -105,4 +110,40 @@ BACKEND_LOG_ROOT=C:/NeoZQYY/export/BACKEND/LOGS
 # CHANGE 2026-02-23 | Migrated from the PRD document into .env; plaintext storage in docs is forbidden
 # ------------------------------------------------------------------------------
 BAILIAN_API_KEY=sk-6def29cab3474cc797e52b82a46a5dba
+BAILIAN_MODEL=qwen-plus
+BAILIAN_BASE_URL=https://dashscope.aliyuncs.com/compatible-mode/v1
 BAILIAN_TEST_APP_ID=541edb3d5fcd4c18b13cbad81bb5fb9d
+
+# CHANGE 2026-03-05 | 8 Bailian AI application IDs (obtained from the Bailian platform, updated 2026-03-05)
+BAILIAN_APP_ID_1_CHAT=979dabe6f22a43989632b8c662cac97c
+BAILIAN_APP_ID_2_FINANCE=1dcdb5f39c3040b6af8ef79215b9b051
+BAILIAN_APP_ID_3_CLUE=708bf45439cd48c7ab9a514d03482890
+BAILIAN_APP_ID_4_ANALYSIS=ea7b1c374f574b9a925a2fb5789a9b90
+BAILIAN_APP_ID_5_TACTICS=46f54e6053df4bb0b83be29366025cf6
+BAILIAN_APP_ID_6_NOTE=025bb344146b4e4e8be30c444adab3b4
+BAILIAN_APP_ID_7_CUSTOMER=df35e06991b24d49971c03c6428a9c87
+BAILIAN_APP_ID_8_CONSOLIDATE=407dfb89283b4196934eec5fefe3ebc2
+
+# ------------------------------------------------------------------------------
+# WeChat mini program
+# CHANGE 2026-02-27 | Dev mode enables mock login, skipping WeChat code2Session
+# Production must set a real WX_APPID / WX_SECRET and turn WX_DEV_MODE off
+# ------------------------------------------------------------------------------
+WX_APPID=wx7c07793d82732921
+WX_SECRET=wx7c07793d82732921
+WX_DEV_MODE=true
+
+# ------------------------------------------------------------------------------
+# Pipeline rate-limit configuration (RateLimiter request interval)
+# CHANGE 2026-03-06 | Lowered from the default 5-20s to 0.1-2s; 46 ODS_PAYMENT requests drop from 582s to ~48s
+# ------------------------------------------------------------------------------
+PIPELINE_RATE_MIN=0.1
+PIPELINE_RATE_MAX=2.0
+
+# ------------------------------------------------------------------------------
+# Backend ops panel path configuration
+# CHANGE 2026-03-06 | Locked explicitly so that __file__ inference cannot resolve to the wrong path in different deployments
+# ------------------------------------------------------------------------------
+OPS_SERVER_BASE=C:/NeoZQYY
+ETL_PROJECT_PATH=C:/NeoZQYY/apps/etl/connectors/feiqiu
+ETL_PYTHON_EXECUTABLE=C:/NeoZQYY/.venv/Scripts/python.exe
diff --git a/.env.template b/.env.template
index a9ae234..2dda883 100644
--- a/.env.template
+++ b/.env.template
@@ -52,6 +52,14 @@ TEST_APP_DB_DSN=postgresql://user:password@host:5432/test_zqyy_app
 TIMEZONE=Asia/Shanghai
 LOG_LEVEL=INFO
 
+# ------------------------------------------------------------------------------
+# Business-day cutoff (hour that splits daily/weekly/monthly stats; default 8, i.e. 08:00)
+# Daily stats   = 08:00 today ~ 08:00 the next day
+# Monthly stats = 08:00 on the 1st ~ 08:00 on the 1st of the next month
+# Weekly stats  = Monday 08:00 ~ next Monday 08:00
+# ------------------------------------------------------------------------------
+BUSINESS_DAY_START_HOUR=8
+
 # ==============================================================================
 # Unified output path configuration (export/ directory)
 # ==============================================================================
@@ -98,8 +106,30 @@ BACKEND_LOG_ROOT=C:/NeoZQYY/export/BACKEND/LOGS
 # Alibaba Cloud Bailian AI configuration
 # ------------------------------------------------------------------------------
 BAILIAN_API_KEY=
+BAILIAN_MODEL=qwen-plus
+BAILIAN_BASE_URL=https://dashscope.aliyuncs.com/compatible-mode/v1
 BAILIAN_TEST_APP_ID=
 
+# 8 Bailian AI application IDs (obtained from the Bailian platform)
+# App 1: general chat | App 2: finance insights | App 3: customer-data retention-lead analysis
+# App 4: relationship analysis / task suggestions | App 5: talk-track reference | App 6: note analysis
+# App 7: customer analysis | App 8: retention-lead consolidation
+BAILIAN_APP_ID_1_CHAT=
+BAILIAN_APP_ID_2_FINANCE=
+BAILIAN_APP_ID_3_CLUE=
+BAILIAN_APP_ID_4_ANALYSIS=
+BAILIAN_APP_ID_5_TACTICS=
+BAILIAN_APP_ID_6_NOTE=
+BAILIAN_APP_ID_7_CUSTOMER=
+BAILIAN_APP_ID_8_CONSOLIDATE=
+
+# ------------------------------------------------------------------------------
+# Pipeline rate-limit configuration (RateLimiter request interval, seconds)
+# Default 0.1-2.0s: avoids upstream risk control while preventing excessive waiting
+# ------------------------------------------------------------------------------
+PIPELINE_RATE_MIN=0.1
+PIPELINE_RATE_MAX=2.0
+
 # ╔════════════════════════════════════════════════════════════════════════════╗
 # ║ [ETL] apps/etl/connectors/feiqiu/.env — ETL-specific configuration         ║
 # ╚════════════════════════════════════════════════════════════════════════════╝
@@ -234,7 +264,7 @@ DWD_FACT_UPSERT=true
 # ------------------------------------------------------------------------------
 # Task list configuration
 # ------------------------------------------------------------------------------
-RUN_TASKS=PRODUCTS,TABLES,MEMBERS,ASSISTANTS,PACKAGES_DEF,ORDERS,PAYMENTS,REFUNDS,COUPON_USAGE,INVENTORY_CHANGE,TOPUPS,TABLE_DISCOUNT,ASSISTANT_ABOLISH,LEDGER
+RUN_TASKS=PRODUCTS,TABLES,MEMBERS,ASSISTANTS,PACKAGES_DEF,ORDERS,PAYMENTS,REFUNDS,COUPON_USAGE,INVENTORY_CHANGE,TOPUPS,TABLE_DISCOUNT,LEDGER
 # RUN_DWS_TASKS=
 # RUN_INDEX_TASKS=
 INDEX_LOOKBACK_DAYS=60
@@ -282,10 +312,12 @@ INDEX_LOOKBACK_DAYS=60
 # ------------------------------------------------------------------------------
 # WeChat mini program configuration
+# The code reads WX_APPID / WX_SECRET (note: no underscore separator)
+# WX_DEV_MODE=true enables the mock login endpoint, skipping WeChat code2Session
 # ------------------------------------------------------------------------------
-# WX_CALLBACK_TOKEN=
-# WX_APP_ID=
-# WX_APP_SECRET=
+# WX_APPID=
+# WX_SECRET=
+# WX_DEV_MODE=false
 
 # ------------------------------------------------------------------------------
 # CORS (comma-separated)
@@ -293,6 +325,19 @@
 # CORS_ORIGINS=http://localhost:5173
 
 # ------------------------------------------------------------------------------
-# ETL project path (subprocess cwd; by default inferred from the monorepo relative path)
+# ETL project path (subprocess cwd)
+# CHANGE 2026-03-06 | Must be set explicitly; relying on __file__ inference is forbidden
 # ------------------------------------------------------------------------------
-# ETL_PROJECT_PATH=C:/NeoZQYY/apps/etl/connectors/feiqiu
\ No newline at end of file
+ETL_PROJECT_PATH=C:/NeoZQYY/apps/etl/connectors/feiqiu
+
+# ------------------------------------------------------------------------------
+# Python executable for the ETL subprocess
+# CHANGE 2026-03-06 | Must be set explicitly to avoid PATH ambiguity
+# ------------------------------------------------------------------------------
+ETL_PYTHON_EXECUTABLE=C:/NeoZQYY/.venv/Scripts/python.exe
+
+# ------------------------------------------------------------------------------
+# Ops panel server root directory
+# CHANGE 2026-03-06 | Must be set explicitly to eliminate __file__ inference risk
+# ------------------------------------------------------------------------------
+OPS_SERVER_BASE=C:/NeoZQYY
\ No newline at end of file
diff --git a/.gitignore b/.gitignore
index e96e94c..14325f3 100644
--- a/.gitignore
+++ b/.gitignore
@@ -81,5 +81,14 @@ infra/**/*.secret
 # ===== Kiro runtime state =====
 .kiro/.audit_state.json
 .kiro/.last_prompt_id.json
+.kiro/.git_snapshot.json
+.kiro/.file_baseline.json
+.kiro/.compliance_state.json
+.kiro/.audit_context.json
+
+# ===== Ops script runtime state =====
+scripts/ops/.monitor_token
+
+# ===== Kiro Powers (contain sensitive DSNs) =====
+powers/
diff --git a/.kiro/.audit_context.json b/.kiro/.audit_context.json
deleted file mode 100644
index b508215..0000000
--- a/.kiro/.audit_context.json
+++ /dev/null
@@ -1,526 +0,0 @@
-{ - "built_at": "2026-02-26T07:47:14.502315+08:00", - "prompt_id": "P20260226-074515", - "prompt_at": "2026-02-26T07:45:15.438463+08:00", - "audit_required": true, - "db_docs_required": true, - "reasons": [ - "root-file", - "dir:backend", - "dir:etl", - "dir:db", - "db-schema-change", - "dir:shared" - ], - "changed_files": [ - ".gitignore", - "README.md", - "ai-icon-demo-deep-color.png", - "ai-icon-demo-final.png", - "ai-icon-demo-full.png", - "ai-icon-demo-light-border.png", - "ai-icon-demo-light-v1.png", - "ai-icon-demo-light-v2.png", - "ai-icon-demo-v2.png", - "ai-icon-demo-v3.png", - "ai-icon-demo-white-glow.png", - "apps/admin-web/README.md", - "apps/backend/app/auth/dependencies.py", - "apps/backend/app/auth/jwt.py", -
"apps/backend/app/main.py", - "apps/backend/app/routers/xcx_auth.py", - "apps/backend/app/schemas/xcx_auth.py", - "apps/backend/app/services/application.py", - "apps/backend/app/services/matching.py", - "apps/backend/app/services/role.py", - "apps/backend/app/services/task_registry.py", - "apps/backend/app/services/wechat.py", - "apps/backend/auth_only.txt", - "apps/backend/auth_only_results.txt", - "apps/backend/auth_test_results.txt", - "apps/backend/docs/", - "apps/backend/test_results.txt", - "apps/etl/connectors/feiqiu/docs/CHANGELOG.md", - "apps/etl/connectors/feiqiu/docs/README.md", - "apps/etl/connectors/feiqiu/docs/business-rules/dws_metrics.md", - "apps/etl/connectors/feiqiu/docs/business-rules/scd2_rules.md", - "apps/etl/connectors/feiqiu/docs/etl_tasks/README.md", - "apps/etl/connectors/feiqiu/docs/etl_tasks/base_task_mechanism.md", - "apps/etl/connectors/feiqiu/docs/etl_tasks/dws_tasks.md", - "apps/etl/connectors/feiqiu/docs/etl_tasks/index_tasks.md", - "apps/etl/connectors/feiqiu/docs/etl_tasks/ods_tasks.md", - "apps/etl/connectors/feiqiu/docs/operations/environment_setup.md", - "apps/etl/connectors/feiqiu/docs/operations/troubleshooting.md", - "apps/etl/connectors/feiqiu/orchestration/flow_runner.py", - "apps/etl/connectors/feiqiu/orchestration/task_executor.py", - "apps/etl/connectors/feiqiu/orchestration/task_registry.py", - "apps/etl/connectors/feiqiu/orchestration/topological_sort.py", - "apps/etl/connectors/feiqiu/quality/consistency_checker.py", - "apps/etl/connectors/feiqiu/scripts/verify_dws_extensions.py", - "apps/etl/connectors/feiqiu/tasks/dwd/dwd_load_task.py", - "apps/etl/connectors/feiqiu/tasks/dws/__init__.py", - "apps/etl/connectors/feiqiu/tasks/dws/assistant_daily_task.py", - "apps/etl/connectors/feiqiu/tasks/dws/assistant_order_contribution_task.py", - "apps/etl/connectors/feiqiu/tasks/dws/member_consumption_task.py", - "apps/etl/connectors/feiqiu/tasks/dws/member_visit_task.py" - ], - "high_risk_files": [ - 
"apps/backend/app/auth/dependencies.py", - "apps/backend/app/auth/jwt.py", - "apps/backend/app/main.py", - "apps/backend/app/routers/xcx_auth.py", - "apps/backend/app/schemas/xcx_auth.py", - "apps/backend/app/services/application.py", - "apps/backend/app/services/matching.py", - "apps/backend/app/services/role.py", - "apps/backend/app/services/task_registry.py", - "apps/backend/app/services/wechat.py", - "apps/etl/connectors/feiqiu/orchestration/flow_runner.py", - "apps/etl/connectors/feiqiu/orchestration/task_executor.py", - "apps/etl/connectors/feiqiu/orchestration/task_registry.py", - "apps/etl/connectors/feiqiu/orchestration/topological_sort.py", - "apps/etl/connectors/feiqiu/quality/consistency_checker.py", - "apps/etl/connectors/feiqiu/tasks/dwd/dwd_load_task.py", - "apps/etl/connectors/feiqiu/tasks/dws/__init__.py", - "apps/etl/connectors/feiqiu/tasks/dws/assistant_daily_task.py", - "apps/etl/connectors/feiqiu/tasks/dws/assistant_order_contribution_task.py", - "apps/etl/connectors/feiqiu/tasks/dws/member_consumption_task.py", - "apps/etl/connectors/feiqiu/tasks/dws/member_visit_task.py" - ], - "external_files": [ - "db/zqyy_app/migrations/2026-02-25__p3_create_auth_tables.sql", - "db/zqyy_app/migrations/2026-02-25__p3_seed_roles_permissions.sql", - "docs/DOCUMENTATION-MAP.md", - "docs/README.md", - "docs/database/BD_Manual_app_schema_rls_views.md", - "docs/database/BD_Manual_auth_biz_schemas.md", - "docs/database/BD_Manual_auth_tables.md", - "docs/database/BD_Manual_dws_assistant_order_contribution.md", - "docs/database/BD_Manual_dws_member_spending_power_index.md", - "docs/database/BD_Manual_fdw_etl_setup.md", - "docs/database/BD_Manual_goods_stock_warning_info.md", - "docs/database/README.md", - "docs/database/ddl/etl_feiqiu__app.sql", - "docs/database/ddl/etl_feiqiu__core.sql", - "docs/database/ddl/etl_feiqiu__dwd.sql", - "docs/database/ddl/etl_feiqiu__dws.sql", - "docs/database/ddl/etl_feiqiu__meta.sql", - "docs/database/ddl/etl_feiqiu__ods.sql", - 
"docs/database/ddl/fdw.sql", - "docs/database/ddl/zqyy_app__auth.sql", - "docs/database/ddl/zqyy_app__public.sql", - "docs/deployment/LAUNCH-CHECKLIST.md", - "docs/h5_ui/css/ai-icons.css", - "docs/h5_ui/index.html", - "docs/h5_ui/js/ai-icons.js", - "docs/h5_ui/pages/ai-icon-demo.html", - "docs/h5_ui/pages/board-finance.html", - "docs/h5_ui/pages/customer-detail.html", - "docs/h5_ui/pages/feiqiu-ETL.code-workspace", - "docs/h5_ui/pages/my-profile.html", - "docs/h5_ui/pages/task-detail-callback.html", - "docs/h5_ui/pages/task-detail-priority.html", - "docs/h5_ui/pages/task-detail-relationship.html", - "docs/h5_ui/pages/task-detail.html", - "docs/h5_ui/pages/task-list.html", - "docs/prd/specs/00-/346/225/260/346/215/256/344/276/235/350/265/226/347/237/251/351/230/265.md", - "docs/prd/specs/01-SPEC/344/273/273/345/212/241/346/213/206/345/210/206/346/200/273/350/247/210.md", - "docs/prd/specs/P10-tenant-admin-web.md", - "docs/prd/specs/P2-etl-dws-miniapp-extensions.md", - "docs/prd/specs/P3-miniapp-auth-system.md", - "docs/roadmap/2026-02-24__fdw-dwd-to-core-migration-plan.md", - "docs/roadmap/BACKLOG.md", - "export/ETL-Connectors/feiqiu/JSON/ODS_ASSISTANT_ACCOUNT/ODS_ASSISTANT_ACCOUNT-8931-20260224-002444/", - "export/ETL-Connectors/feiqiu/JSON/ODS_ASSISTANT_ACCOUNT/ODS_ASSISTANT_ACCOUNT-8952-20260224-021528/", - "export/ETL-Connectors/feiqiu/JSON/ODS_ASSISTANT_ACCOUNT/ODS_ASSISTANT_ACCOUNT-8983-20260224-062414/", - "export/ETL-Connectors/feiqiu/JSON/ODS_ASSISTANT_ACCOUNT/ODS_ASSISTANT_ACCOUNT-9012-20260224-122723/", - "export/ETL-Connectors/feiqiu/JSON/ODS_ASSISTANT_ACCOUNT/ODS_ASSISTANT_ACCOUNT-9041-20260225-001547/", - "export/ETL-Connectors/feiqiu/JSON/ODS_ASSISTANT_ACCOUNT/ODS_ASSISTANT_ACCOUNT-9069-20260225-050638/", - "export/ETL-Connectors/feiqiu/JSON/ODS_ASSISTANT_LEDGER/ODS_ASSISTANT_LEDGER-8932-20260224-002450/", - "export/ETL-Connectors/feiqiu/JSON/ODS_ASSISTANT_LEDGER/ODS_ASSISTANT_LEDGER-8953-20260224-021535/", - 
"export/ETL-Connectors/feiqiu/JSON/ODS_ASSISTANT_LEDGER/ODS_ASSISTANT_LEDGER-8984-20260224-062420/", - "export/ETL-Connectors/feiqiu/JSON/ODS_ASSISTANT_LEDGER/ODS_ASSISTANT_LEDGER-9013-20260224-122730/", - "export/ETL-Connectors/feiqiu/JSON/ODS_ASSISTANT_LEDGER/ODS_ASSISTANT_LEDGER-9042-20260225-001554/", - "export/ETL-Connectors/feiqiu/JSON/ODS_ASSISTANT_LEDGER/ODS_ASSISTANT_LEDGER-9070-20260225-050644/", - "export/ETL-Connectors/feiqiu/JSON/ODS_GOODS_CATEGORY/ODS_GOODS_CATEGORY-8965-20260224-022946/", - "export/ETL-Connectors/feiqiu/JSON/ODS_GOODS_CATEGORY/ODS_GOODS_CATEGORY-8995-20260224-063823/", - "export/ETL-Connectors/feiqiu/JSON/ODS_GOODS_CATEGORY/ODS_GOODS_CATEGORY-9024-20260224-124218/", - "export/ETL-Connectors/feiqiu/JSON/ODS_GOODS_CATEGORY/ODS_GOODS_CATEGORY-9053-20260225-003115/", - "export/ETL-Connectors/feiqiu/JSON/ODS_GOODS_CATEGORY/ODS_GOODS_CATEGORY-9081-20260225-052134/", - "export/ETL-Connectors/feiqiu/JSON/ODS_GROUP_BUY_REDEMPTION/ODS_GROUP_BUY_REDEMPTION-8946-20260224-004949/", - "export/ETL-Connectors/feiqiu/JSON/ODS_GROUP_BUY_REDEMPTION/ODS_GROUP_BUY_REDEMPTION-8971-20260224-023859/", - "export/ETL-Connectors/feiqiu/JSON/ODS_GROUP_BUY_REDEMPTION/ODS_GROUP_BUY_REDEMPTION-9001-20260224-064657/", - "export/ETL-Connectors/feiqiu/JSON/ODS_GROUP_BUY_REDEMPTION/ODS_GROUP_BUY_REDEMPTION-9030-20260224-125319/", - "export/ETL-Connectors/feiqiu/JSON/ODS_GROUP_BUY_REDEMPTION/ODS_GROUP_BUY_REDEMPTION-9059-20260225-004056/", - "export/ETL-Connectors/feiqiu/JSON/ODS_GROUP_BUY_REDEMPTION/ODS_GROUP_BUY_REDEMPTION-9087-20260225-053101/", - "export/ETL-Connectors/feiqiu/JSON/ODS_GROUP_PACKAGE/ODS_GROUP_PACKAGE-8945-20260224-004946/", - "export/ETL-Connectors/feiqiu/JSON/ODS_GROUP_PACKAGE/ODS_GROUP_PACKAGE-8970-20260224-023856/", - "export/ETL-Connectors/feiqiu/JSON/ODS_GROUP_PACKAGE/ODS_GROUP_PACKAGE-9000-20260224-064654/", - "export/ETL-Connectors/feiqiu/JSON/ODS_GROUP_PACKAGE/ODS_GROUP_PACKAGE-9029-20260224-125316/", - 
"export/ETL-Connectors/feiqiu/JSON/ODS_GROUP_PACKAGE/ODS_GROUP_PACKAGE-9058-20260225-004053/", - "export/ETL-Connectors/feiqiu/JSON/ODS_GROUP_PACKAGE/ODS_GROUP_PACKAGE-9086-20260225-053058/", - "export/ETL-Connectors/feiqiu/JSON/ODS_INVENTORY_CHANGE/ODS_INVENTORY_CHANGE-8973-20260224-024333/", - "export/ETL-Connectors/feiqiu/JSON/ODS_INVENTORY_CHANGE/ODS_INVENTORY_CHANGE-9003-20260224-065054/", - "export/ETL-Connectors/feiqiu/JSON/ODS_INVENTORY_CHANGE/ODS_INVENTORY_CHANGE-9032-20260224-125752/", - "export/ETL-Connectors/feiqiu/JSON/ODS_INVENTORY_CHANGE/ODS_INVENTORY_CHANGE-9061-20260225-004540/", - "export/ETL-Connectors/feiqiu/JSON/ODS_INVENTORY_CHANGE/ODS_INVENTORY_CHANGE-9089-20260225-053508/", - "export/ETL-Connectors/feiqiu/JSON/ODS_INVENTORY_STOCK/ODS_INVENTORY_STOCK-8972-20260224-024323/", - "export/ETL-Connectors/feiqiu/JSON/ODS_INVENTORY_STOCK/ODS_INVENTORY_STOCK-9002-20260224-065046/", - "export/ETL-Connectors/feiqiu/JSON/ODS_INVENTORY_STOCK/ODS_INVENTORY_STOCK-9031-20260224-125742/", - "export/ETL-Connectors/feiqiu/JSON/ODS_INVENTORY_STOCK/ODS_INVENTORY_STOCK-9060-20260225-004531/", - "export/ETL-Connectors/feiqiu/JSON/ODS_INVENTORY_STOCK/ODS_INVENTORY_STOCK-9088-20260225-053459/", - "export/ETL-Connectors/feiqiu/JSON/ODS_MEMBER/ODS_MEMBER-8947-20260224-005404/", - "export/ETL-Connectors/feiqiu/JSON/ODS_MEMBER/ODS_MEMBER-8961-20260224-022706/", - "export/ETL-Connectors/feiqiu/JSON/ODS_MEMBER/ODS_MEMBER-8991-20260224-063552/", - "export/ETL-Connectors/feiqiu/JSON/ODS_MEMBER/ODS_MEMBER-9020-20260224-123913/", - "export/ETL-Connectors/feiqiu/JSON/ODS_MEMBER/ODS_MEMBER-9049-20260225-002816/", - "export/ETL-Connectors/feiqiu/JSON/ODS_MEMBER/ODS_MEMBER-9077-20260225-051851/", - "export/ETL-Connectors/feiqiu/JSON/ODS_MEMBER_BALANCE/ODS_MEMBER_BALANCE-8949-20260224-005446/", - "export/ETL-Connectors/feiqiu/JSON/ODS_MEMBER_BALANCE/ODS_MEMBER_BALANCE-8963-20260224-022749/", - 
"export/ETL-Connectors/feiqiu/JSON/ODS_MEMBER_BALANCE/ODS_MEMBER_BALANCE-8993-20260224-063631/", - "export/ETL-Connectors/feiqiu/JSON/ODS_MEMBER_BALANCE/ODS_MEMBER_BALANCE-9022-20260224-124004/", - "export/ETL-Connectors/feiqiu/JSON/ODS_MEMBER_BALANCE/ODS_MEMBER_BALANCE-9051-20260225-002911/", - "export/ETL-Connectors/feiqiu/JSON/ODS_MEMBER_BALANCE/ODS_MEMBER_BALANCE-9079-20260225-051934/", - "export/ETL-Connectors/feiqiu/JSON/ODS_MEMBER_CARD/ODS_MEMBER_CARD-8948-20260224-005416/", - "export/ETL-Connectors/feiqiu/JSON/ODS_MEMBER_CARD/ODS_MEMBER_CARD-8962-20260224-022719/", - "export/ETL-Connectors/feiqiu/JSON/ODS_MEMBER_CARD/ODS_MEMBER_CARD-8992-20260224-063604/", - "export/ETL-Connectors/feiqiu/JSON/ODS_MEMBER_CARD/ODS_MEMBER_CARD-9021-20260224-123928/", - "export/ETL-Connectors/feiqiu/JSON/ODS_MEMBER_CARD/ODS_MEMBER_CARD-9050-20260225-002831/", - "export/ETL-Connectors/feiqiu/JSON/ODS_MEMBER_CARD/ODS_MEMBER_CARD-9078-20260225-051904/", - "export/ETL-Connectors/feiqiu/JSON/ODS_PAYMENT/ODS_PAYMENT-8939-20260224-003129/", - "export/ETL-Connectors/feiqiu/JSON/ODS_PAYMENT/ODS_PAYMENT-8956-20260224-021747/", - "export/ETL-Connectors/feiqiu/JSON/ODS_PAYMENT/ODS_PAYMENT-8986-20260224-062624/", - "export/ETL-Connectors/feiqiu/JSON/ODS_PAYMENT/ODS_PAYMENT-9015-20260224-122940/", - "export/ETL-Connectors/feiqiu/JSON/ODS_PAYMENT/ODS_PAYMENT-9044-20260225-001806/", - "export/ETL-Connectors/feiqiu/JSON/ODS_PAYMENT/ODS_PAYMENT-9072-20260225-050857/", - "export/ETL-Connectors/feiqiu/JSON/ODS_PLATFORM_COUPON/ODS_PLATFORM_COUPON-8944-20260224-004106/", - "export/ETL-Connectors/feiqiu/JSON/ODS_PLATFORM_COUPON/ODS_PLATFORM_COUPON-8969-20260224-022959/", - "export/ETL-Connectors/feiqiu/JSON/ODS_PLATFORM_COUPON/ODS_PLATFORM_COUPON-8999-20260224-063835/", - "export/ETL-Connectors/feiqiu/JSON/ODS_PLATFORM_COUPON/ODS_PLATFORM_COUPON-9028-20260224-124234/", - "export/ETL-Connectors/feiqiu/JSON/ODS_PLATFORM_COUPON/ODS_PLATFORM_COUPON-9057-20260225-003128/", - 
"export/ETL-Connectors/feiqiu/JSON/ODS_PLATFORM_COUPON/ODS_PLATFORM_COUPON-9085-20260225-052147/", - "export/ETL-Connectors/feiqiu/JSON/ODS_RECHARGE_SETTLE/ODS_RECHARGE_SETTLE-8950-20260224-005639/", - "export/ETL-Connectors/feiqiu/JSON/ODS_RECHARGE_SETTLE/ODS_RECHARGE_SETTLE-8964-20260224-022942/", - "export/ETL-Connectors/feiqiu/JSON/ODS_RECHARGE_SETTLE/ODS_RECHARGE_SETTLE-8994-20260224-063819/", - "export/ETL-Connectors/feiqiu/JSON/ODS_RECHARGE_SETTLE/ODS_RECHARGE_SETTLE-9023-20260224-124214/", - "export/ETL-Connectors/feiqiu/JSON/ODS_RECHARGE_SETTLE/ODS_RECHARGE_SETTLE-9052-20260225-003111/", - "export/ETL-Connectors/feiqiu/JSON/ODS_RECHARGE_SETTLE/ODS_RECHARGE_SETTLE-9080-20260225-052130/", - "export/ETL-Connectors/feiqiu/JSON/ODS_REFUND/ODS_REFUND-8940-20260224-003541/", - "export/ETL-Connectors/feiqiu/JSON/ODS_REFUND/ODS_REFUND-8957-20260224-022140/", - "export/ETL-Connectors/feiqiu/JSON/ODS_REFUND/ODS_REFUND-8987-20260224-063015/", - "export/ETL-Connectors/feiqiu/JSON/ODS_REFUND/ODS_REFUND-9016-20260224-123347/", - "export/ETL-Connectors/feiqiu/JSON/ODS_REFUND/ODS_REFUND-9045-20260225-002213/", - "export/ETL-Connectors/feiqiu/JSON/ODS_REFUND/ODS_REFUND-9073-20260225-051302/", - "export/ETL-Connectors/feiqiu/JSON/ODS_SETTLEMENT_RECORDS/ODS_SETTLEMENT_RECORDS-8938-20260224-002943/", - "export/ETL-Connectors/feiqiu/JSON/ODS_SETTLEMENT_RECORDS/ODS_SETTLEMENT_RECORDS-8955-20260224-021559/", - "export/ETL-Connectors/feiqiu/JSON/ODS_SETTLEMENT_RECORDS/ODS_SETTLEMENT_RECORDS-8985-20260224-062445/", - "export/ETL-Connectors/feiqiu/JSON/ODS_SETTLEMENT_RECORDS/ODS_SETTLEMENT_RECORDS-9014-20260224-122754/", - "export/ETL-Connectors/feiqiu/JSON/ODS_SETTLEMENT_RECORDS/ODS_SETTLEMENT_RECORDS-9043-20260225-001622/", - "export/ETL-Connectors/feiqiu/JSON/ODS_SETTLEMENT_RECORDS/ODS_SETTLEMENT_RECORDS-9071-20260225-050710/", - "export/ETL-Connectors/feiqiu/JSON/ODS_STORE_GOODS/ODS_STORE_GOODS-8966-20260224-022948/", - 
"export/ETL-Connectors/feiqiu/JSON/ODS_STORE_GOODS/ODS_STORE_GOODS-8996-20260224-063825/", - "export/ETL-Connectors/feiqiu/JSON/ODS_STORE_GOODS/ODS_STORE_GOODS-9025-20260224-124220/", - "export/ETL-Connectors/feiqiu/JSON/ODS_STORE_GOODS/ODS_STORE_GOODS-9054-20260225-003118/", - "export/ETL-Connectors/feiqiu/JSON/ODS_STORE_GOODS/ODS_STORE_GOODS-9082-20260225-052136/", - "export/ETL-Connectors/feiqiu/JSON/ODS_STORE_GOODS_SALES/ODS_STORE_GOODS_SALES-8967-20260224-022953/", - "export/ETL-Connectors/feiqiu/JSON/ODS_STORE_GOODS_SALES/ODS_STORE_GOODS_SALES-8997-20260224-063829/", - "export/ETL-Connectors/feiqiu/JSON/ODS_STORE_GOODS_SALES/ODS_STORE_GOODS_SALES-9026-20260224-124227/", - "export/ETL-Connectors/feiqiu/JSON/ODS_STORE_GOODS_SALES/ODS_STORE_GOODS_SALES-9055-20260225-003122/", - "export/ETL-Connectors/feiqiu/JSON/ODS_STORE_GOODS_SALES/ODS_STORE_GOODS_SALES-9083-20260225-052141/", - "export/ETL-Connectors/feiqiu/JSON/ODS_TABLES/ODS_TABLES-8943-20260224-004103/", - "export/ETL-Connectors/feiqiu/JSON/ODS_TABLES/ODS_TABLES-8960-20260224-022703/", - "export/ETL-Connectors/feiqiu/JSON/ODS_TABLES/ODS_TABLES-8990-20260224-063548/", - "export/ETL-Connectors/feiqiu/JSON/ODS_TABLES/ODS_TABLES-9019-20260224-123910/", - "export/ETL-Connectors/feiqiu/JSON/ODS_TABLES/ODS_TABLES-9048-20260225-002812/", - "export/ETL-Connectors/feiqiu/JSON/ODS_TABLES/ODS_TABLES-9076-20260225-051848/", - "export/ETL-Connectors/feiqiu/JSON/ODS_TABLE_FEE_DISCOUNT/ODS_TABLE_FEE_DISCOUNT-8942-20260224-004011/", - "export/ETL-Connectors/feiqiu/JSON/ODS_TABLE_FEE_DISCOUNT/ODS_TABLE_FEE_DISCOUNT-8959-20260224-022559/", - "export/ETL-Connectors/feiqiu/JSON/ODS_TABLE_FEE_DISCOUNT/ODS_TABLE_FEE_DISCOUNT-8989-20260224-063453/", - "export/ETL-Connectors/feiqiu/JSON/ODS_TABLE_FEE_DISCOUNT/ODS_TABLE_FEE_DISCOUNT-9018-20260224-123816/", - "export/ETL-Connectors/feiqiu/JSON/ODS_TABLE_FEE_DISCOUNT/ODS_TABLE_FEE_DISCOUNT-9047-20260225-002706/", - 
"export/ETL-Connectors/feiqiu/JSON/ODS_TABLE_FEE_DISCOUNT/ODS_TABLE_FEE_DISCOUNT-9075-20260225-051752/", - "export/ETL-Connectors/feiqiu/JSON/ODS_TABLE_USE/ODS_TABLE_USE-8941-20260224-003546/", - "export/ETL-Connectors/feiqiu/JSON/ODS_TABLE_USE/ODS_TABLE_USE-8958-20260224-022143/", - "export/ETL-Connectors/feiqiu/JSON/ODS_TABLE_USE/ODS_TABLE_USE-8988-20260224-063018/", - "export/ETL-Connectors/feiqiu/JSON/ODS_TABLE_USE/ODS_TABLE_USE-9017-20260224-123351/", - "export/ETL-Connectors/feiqiu/JSON/ODS_TABLE_USE/ODS_TABLE_USE-9046-20260225-002216/", - "export/ETL-Connectors/feiqiu/JSON/ODS_TABLE_USE/ODS_TABLE_USE-9074-20260225-051306/", - "export/ETL-Connectors/feiqiu/JSON/ODS_TENANT_GOODS/ODS_TENANT_GOODS-8968-20260224-022954/", - "export/ETL-Connectors/feiqiu/JSON/ODS_TENANT_GOODS/ODS_TENANT_GOODS-8998-20260224-063830/", - "export/ETL-Connectors/feiqiu/JSON/ODS_TENANT_GOODS/ODS_TENANT_GOODS-9027-20260224-124229/", - "export/ETL-Connectors/feiqiu/JSON/ODS_TENANT_GOODS/ODS_TENANT_GOODS-9056-20260225-003124/", - "export/ETL-Connectors/feiqiu/JSON/ODS_TENANT_GOODS/ODS_TENANT_GOODS-9084-20260225-052142/", - "export/ETL-Connectors/feiqiu/REPORTS/blackbox_report_20260220_181225.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_check_20260221_115751.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_check_20260221_120249.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_check_20260221_122116.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_check_20260221_125127.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_check_20260221_130447.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_check_20260221_130620.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_check_20260224_030422.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_check_20260224_030606.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_check_20260224_172522.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_check_20260224_172711.md", - 
"export/ETL-Connectors/feiqiu/REPORTS/consistency_check_20260224_173114.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_check_20260224_173244.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_check_20260225_005410.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_check_20260225_005603.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_check_20260225_005756.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_check_20260225_005954.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260220_072152.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260220_072211.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260220_073610.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260220_091414.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260221_153910.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260221_193018.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260221_195222.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260221_200857.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260221_203129.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260221_211445.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260221_212639.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260221_213501.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260221_224027.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260221_225013.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260224_005646.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260224_025039.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260224_065727.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260224_130543.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260225_005032.md", - 
"export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260225_023232.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260225_023253.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260225_023315.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260225_023339.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260225_023402.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260225_023444.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260225_023506.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260225_023528.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260225_023550.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260225_023612.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260225_023658.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260225_023721.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260225_023744.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260225_023806.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260225_023831.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260225_023913.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260225_023935.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260225_023957.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260225_024019.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260225_024041.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260225_024124.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260225_024146.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260225_024210.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260225_024238.md", - "export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260225_024307.md", - 
"export/ETL-Connectors/feiqiu/REPORTS/consistency_report_20260225_053925.md", - "export/ETL-Connectors/feiqiu/REPORTS/context_handoff_task2.md", - "export/ETL-Connectors/feiqiu/REPORTS/ddl_consistency_20260221_212255.md", - "export/ETL-Connectors/feiqiu/REPORTS/ddl_consistency_20260221_212621.md", - "export/ETL-Connectors/feiqiu/REPORTS/ddl_consistency_20260221_212726.md", - "export/ETL-Connectors/feiqiu/REPORTS/dwd_quality_report.json", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260220_072133.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260220_072152.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260220_073610.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260220_091414.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260221_153910.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260221_193018.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260221_195222.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260221_200857.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260221_203129.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260221_211445.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260221_212639.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260221_213501.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260221_224027.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260221_225013.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260224_005646.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260224_025039.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260224_065727.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260224_130543.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260225_005032.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260225_023232.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260225_023253.md", - 
"export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260225_023315.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260225_023339.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260225_023402.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260225_023444.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260225_023506.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260225_023528.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260225_023550.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260225_023612.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260225_023658.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260225_023721.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260225_023744.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260225_023806.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260225_023831.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260225_023913.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260225_023935.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260225_023957.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260225_024019.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260225_024041.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260225_024124.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260225_024146.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260225_024210.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260225_024238.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260225_024307.md", - "export/ETL-Connectors/feiqiu/REPORTS/etl_timing_20260225_053925.md", - "export/ETL-Connectors/feiqiu/REPORTS/field_level_report_20260220_233100.md", - "export/ETL-Connectors/feiqiu/REPORTS/field_level_report_20260220_233247.md", - "export/ETL-Connectors/feiqiu/REPORTS/field_level_report_20260220_233335.md", - 
"export/ETL-Connectors/feiqiu/REPORTS/field_level_report_20260220_233432.md", - "export/ETL-Connectors/feiqiu/REPORTS/field_level_report_20260220_233443.md", - "export/SYSTEM/LOGS/2026-02-21__dws_assistant_daily_bug_fix.md", - "export/SYSTEM/LOGS/2026-02-21__etl_full_bug_report.md", - "export/SYSTEM/LOGS/2026-02-21__etl_run_raw.json", - "export/SYSTEM/LOGS/2026-02-21__etl_run_raw_v2.json", - "export/SYSTEM/LOGS/2026-02-21__etl_run_raw_v3.json", - "export/SYSTEM/LOGS/2026-02-21__etl_run_raw_v4.json", - "export/SYSTEM/LOGS/2026-02-21__etl_run_raw_v5.json", - "export/SYSTEM/LOGS/2026-02-21__etl_run_raw_v6.json", - "export/SYSTEM/LOGS/2026-02-21__etl_run_raw_v7.json", - "export/SYSTEM/LOGS/2026-02-21__etl_run_raw_v8.json", - "export/SYSTEM/LOGS/2026-02-21__etl_run_result.md", - "export/SYSTEM/LOGS/2026-02-21__etl_run_result_v2.md", - "export/SYSTEM/LOGS/2026-02-21__etl_run_result_v3.md", - "export/SYSTEM/LOGS/2026-02-21__etl_run_result_v4.md", - "export/SYSTEM/LOGS/2026-02-21__etl_run_result_v5.md", - "export/SYSTEM/LOGS/2026-02-21__etl_run_result_v6.md", - "export/SYSTEM/LOGS/2026-02-21__etl_run_result_v8.md", - "export/SYSTEM/LOGS/2026-02-24__etl_integration_report.md", - "export/debug_errors_extract.txt", - "export/debug_logs_raw.json", - "export/debug_logs_raw.txt", - "export/temp_execution_logs.json", - "export/temp_raw_execution_log.txt", - "export/temp_timing_report.md", - "packages/shared/README.md", - "scripts/audit/gen_audit_dashboard.py", - "scripts/ops/_consistency_output.txt", - "scripts/ops/_run_migrations_20260224.py", - "scripts/ops/_tmp_execution_logs.json", - "scripts/ops/_verify_backend.py", - "scripts/ops/_verify_migration_20260224.py", - "scripts/ops/api_health_check.py", - "scripts/ops/etl_integration_report.py", - "scripts/ops/etl_monitor.py", - "scripts/ops/extract_timing_data.py", - "scripts/ops/fix_assistant_ledger_misdelete.py", - "scripts/ops/gen_consolidated_ddl.py", - "scripts/ops/gen_integration_report.py", - 
"scripts/ops/run_seed_spi_params.py", - "scripts/ops/validate_p1_db_foundation.py", - "scripts/server/init-server-env.py", - "scripts/server/server-exclude.txt", - "test_p4_output.txt", - "tests/test_auth_system_properties.py", - "tests/test_dws_contribution_properties.py", - "tests/test_etl_refactor_properties.py", - "tests/test_p1_default_privileges.py", - "tests/test_p1_env_missing.py", - "tests/test_p1_fdw_properties.py", - "tests/test_p1_migration_idempotency.py", - "tests/test_p1_migration_structure.py", - "tests/test_p1_rls_filtering.py", - "tests/test_p1_rls_view_properties.py" - ], - "compliance": { - "code_without_docs": [ - { - "file": "apps/backend/app/auth/dependencies.py", - "expected_docs": [ - "apps/backend/docs/API-REFERENCE.md", - "apps/backend/README.md" - ] - }, - { - "file": "apps/backend/app/auth/jwt.py", - "expected_docs": [ - "apps/backend/docs/API-REFERENCE.md", - "apps/backend/README.md" - ] - }, - { - "file": "apps/backend/app/routers/xcx_auth.py", - "expected_docs": [ - "apps/backend/docs/API-REFERENCE.md" - ] - }, - { - "file": "apps/backend/app/services/application.py", - "expected_docs": [ - "apps/backend/docs/API-REFERENCE.md", - "apps/backend/README.md" - ] - }, - { - "file": "apps/backend/app/services/matching.py", - "expected_docs": [ - "apps/backend/docs/API-REFERENCE.md", - "apps/backend/README.md" - ] - }, - { - "file": "apps/backend/app/services/role.py", - "expected_docs": [ - "apps/backend/docs/API-REFERENCE.md", - "apps/backend/README.md" - ] - }, - { - "file": "apps/backend/app/services/task_registry.py", - "expected_docs": [ - "apps/backend/docs/API-REFERENCE.md", - "apps/backend/README.md" - ] - }, - { - "file": "apps/backend/app/services/wechat.py", - "expected_docs": [ - "apps/backend/docs/API-REFERENCE.md", - "apps/backend/README.md" - ] - }, - { - "file": "apps/etl/connectors/feiqiu/orchestration/flow_runner.py", - "expected_docs": [ - "apps/etl/connectors/feiqiu/docs/architecture/" - ] - }, - { - "file": 
"apps/etl/connectors/feiqiu/orchestration/task_executor.py", - "expected_docs": [ - "apps/etl/connectors/feiqiu/docs/architecture/" - ] - }, - { - "file": "apps/etl/connectors/feiqiu/orchestration/task_registry.py", - "expected_docs": [ - "apps/etl/connectors/feiqiu/docs/architecture/" - ] - }, - { - "file": "apps/etl/connectors/feiqiu/orchestration/topological_sort.py", - "expected_docs": [ - "apps/etl/connectors/feiqiu/docs/architecture/" - ] - } - ], - "new_migration_sql": [ - "db/etl_feiqiu/migrations/2025-02-24__alter_assistant_daily_add_penalty_fields.sql", - "db/etl_feiqiu/migrations/2025-02-24__alter_member_consumption_add_recharge_fields.sql", - "db/etl_feiqiu/migrations/2025-02-24__create_dws_assistant_order_contribution.sql", - "db/etl_feiqiu/migrations/2025-02-24__create_rls_view_assistant_order_contribution.sql", - "db/etl_feiqiu/migrations/2026-02-24__add_goods_stock_warning_info.sql", - "db/etl_feiqiu/migrations/2026-02-24__cleanup_assistant_abolish_residual.sql", - "db/etl_feiqiu/migrations/2026-02-24__p1_create_app_schema_rls_views.sql", - "db/zqyy_app/migrations/2026-02-24__p1_create_auth_biz_schemas.sql", - "db/zqyy_app/migrations/2026-02-24__p1_setup_fdw_etl.sql", - "db/zqyy_app/migrations/2026-02-25__p3_create_auth_tables.sql", - "db/zqyy_app/migrations/2026-02-25__p3_seed_roles_permissions.sql" - ], - "has_bd_manual": true, - "has_audit_record": false, - "has_ddl_baseline": true - }, - "diff_stat": ".gitignore | 14 +-\n .kiro/.last_prompt_id.json | 4 +-\n .kiro/agents/audit-writer.md | 108 +-\n .kiro/hooks/audit-flagger.kiro.hook | 2 +-\n .kiro/hooks/audit-reminder.kiro.hook | 2 +-\n .kiro/hooks/prompt-audit-log.kiro.hook | 2 +-\n .kiro/hooks/run-audit-writer.kiro.hook | 6 +-\n .kiro/specs/etl-fullstack-integration/design.md | 126 -\n .../etl-fullstack-integration/requirements.md | 70 -\n .kiro/specs/etl-fullstack-integration/tasks.md | 86 -\n .kiro/specs/spi-spending-power-index/tasks.md | 14 +-\n README.md | 38 +-\n 
apps/backend/app/auth/dependencies.py | 99 +-\n apps/backend/app/auth/jwt.py | 44 +-\n apps/backend/app/main.py | 4 +-\n apps/backend/app/services/task_registry.py | 13 +-\n apps/etl/connectors/feiqiu/docs/CHANGELOG.md | 20 +\n apps/etl/connectors/feiqiu/docs/README.md | 4 -\n .../feiqiu/docs/business-rules/dws_metrics.md | 215 +-\n .../feiqiu/docs/business-rules/scd2_rules.md | 130 +-\n .../etl/connectors/feiqiu/docs/etl_tasks/README.md | 19 +-\n .../feiqiu/docs/etl_tasks/base_task_mechanism.md | 4 +-\n .../connectors/feiqiu/docs/etl_tasks/dws_tasks.md | 58 +-\n .../feiqiu/docs/etl_tasks/index_tasks.md | 205 +-\n .../connectors/feiqiu/docs/etl_tasks/ods_tasks.md | 2 +-\n .../feiqiu/docs/operations/environment_setup.md | 6 -\n .../feiqiu/docs/operations/troubleshooting.md | 2 +-\n .../connectors/feiqiu/orchestration/flow_runner.py | 6 +-\n .../feiqiu/orchestration/task_executor.py | 5 +\n .../feiqiu/orchestration/task_registry.py | 5 +-\n .../feiqiu/orchestration/topological_sort.py | 42 +-\n .../feiqiu/quality/consistency_checker.py | 5 +\n .../connectors/feiqiu/tasks/dwd/dwd_load_task.py | 3 +\n apps/etl/connectors/feiqiu/tasks/dws/__init__.py | 2 +\n .../feiqiu/tasks/dws/assistant_daily_task.py | 205 +-\n .../feiqiu/tasks/dws/member_consumption_task.py | 91 +-\n .../feiqiu/tasks/dws/member_visit_task.py | 2 +\n apps/etl/connectors/feiqiu/tasks/ods/ods_tasks.py | 71 +-\n .../feiqiu/tests/unit/test_topological_sort.py | 86 +-\n apps/miniprogram/README.md | 135 +-\n db/README.md | 21 +-\n .../db/etl_feiqiu/schemas/dwd.sql | 3 +\n .../db/etl_feiqiu/schemas/ods.sql | 3 +\n db/etl_feiqiu/seeds/seed_ods_tasks.sql | 2 +-\n db/etl_feiqiu/seeds/seed_scheduler_tasks.sql | 2 +-\n docs/README.md | 166 +-\n docs/audit/audit_dashboard.md | 5 +-\n docs/database/README.md | 15 +-\n docs/database/ddl/etl_feiqiu__app.sql | 1070 ++++++-\n docs/database/ddl/etl_feiqiu__core.sql | 2 +-\n docs/database/ddl/etl_feiqiu__dwd.sql | 7 +-\n docs/database/ddl/etl_feiqiu__dws.sql | 38 +-\n 
docs/database/ddl/etl_feiqiu__meta.sql | 2 +-\n docs/database/ddl/etl_feiqiu__ods.sql | 7 +-\n docs/database/ddl/fdw.sql | 2 +-\n docs/database/ddl/zqyy_app__public.sql | 2 +-\n docs/deployment/LAUNCH-CHECKLIST.md | 22 +-\n docs/h5_ui/index.html | 16 +\n docs/h5_ui/pages/board-finance.html | 36 +-\n docs/h5_ui/pages/customer-detail.html | 6 +-\n docs/h5_ui/pages/feiqiu-ETL.code-workspace | 13 -\n docs/h5_ui/pages/my-profile.html | 15 -\n docs/h5_ui/pages/task-detail-callback.html | 47 +-\n docs/h5_ui/pages/task-detail-priority.html | 47 +-\n docs/h5_ui/pages/task-detail-relationship.html | 41 +-\n docs/h5_ui/pages/task-detail.html | 110 +-\n docs/h5_ui/pages/task-list.html | 17 +-\n ...276\\235\\350\\265\\226\\347\\237\\251\\351\\230\\265.md\" | 11 +-\n ...213\\206\\345\\210\\206\\346\\200\\273\\350\\247\\210.md\" | 5 +-\n docs/prd/specs/P10-tenant-admin-web.md | 4 +-\n docs/prd/specs/P2-etl-dws-miniapp-extensions.md | 8 +-\n docs/prd/specs/P3-miniapp-auth-system.md | 15 +-\n .../REPORTS/blackbox_report_20260220_181225.md | 183 --\n .../REPORTS/consistency_check_20260221_115751.md | 785 -----\n .../REPORTS/consistency_check_20260221_120249.md | 1851 ------------\n .../REPORTS/consistency_check_20260221_122116.md | 1851 ------------\n .../REPORTS/consistency_check_20260221_125127.md | 2001 -------------\n .../REPORTS/consistency_check_20260221_130447.md | 2005 -------------\n .../REPORTS/consistency_check_20260221_130620.md | 2003 -------------\n .../REPORTS/consistency_report_20260220_072152.md | 335 ---\n .../REPORTS/consistency_report_20260220_072211.md | 335 ---\n .../REPORTS/consistency_report_20260220_073610.md | 335 ---\n .../REPORTS/consistency_report_20260220_091414.md | 335 ---\n .../REPORTS/consistency_report_20260221_153910.md | 125 -\n .../REPORTS/consistency_report_20260221_193018.md | 125 -\n .../REPORTS/consistency_report_20260221_195222.md | 125 -\n .../REPORTS/consistency_report_20260221_200857.md | 125 -\n 
.../REPORTS/consistency_report_20260221_203129.md | 125 -\n .../REPORTS/consistency_report_20260221_211445.md | 125 -\n .../REPORTS/consistency_report_20260221_212639.md | 125 -\n .../REPORTS/consistency_report_20260221_213501.md | 125 -\n .../REPORTS/consistency_report_20260221_224027.md | 88 -\n .../REPORTS/consistency_report_20260221_225013.md | 88 -\n .../feiqiu/REPORTS/context_handoff_task2.md | 57 -\n .../REPORTS/ddl_consistency_20260221_212255.md | 152 -\n .../REPORTS/ddl_consistency_20260221_212621.md | 133 -\n .../REPORTS/ddl_consistency_20260221_212726.md | 131 -\n .../feiqiu/REPORTS/dwd_quality_report.json | 783 -----\n .../feiqiu/REPORTS/etl_timing_20260220_072133.md | 16 -\n .../feiqiu/REPORTS/etl_timing_20260220_072152.md | 16 -\n .../feiqiu/REPORTS/etl_timing_20260220_073610.md | 13 -\n .../feiqiu/REPORTS/etl_timing_20260220_091414.md | 13 -\n .../feiqiu/REPORTS/etl_timing_20260221_153910.md | 13 -\n .../feiqiu/REPORTS/etl_timing_20260221_193018.md | 13 -\n .../feiqiu/REPORTS/etl_timing_20260221_195222.md | 13 -\n .../feiqiu/REPORTS/etl_timing_20260221_200857.md | 13 -\n .../feiqiu/REPORTS/etl_timing_20260221_203129.md | 13 -\n .../feiqiu/REPORTS/etl_timing_20260221_211445.md | 13 -\n .../feiqiu/REPORTS/etl_timing_20260221_212639.md | 13 -\n .../feiqiu/REPORTS/etl_timing_20260221_213501.md | 13 -\n .../feiqiu/REPORTS/etl_timing_20260221_224027.md | 13 -\n .../feiqiu/REPORTS/etl_timing_20260221_225013.md | 13 -\n .../REPORTS/field_level_report_20260220_233100.md | 2749 -----------------\n .../REPORTS/field_level_report_20260220_233247.md | 2973 -------------------\n .../REPORTS/field_level_report_20260220_233335.md | 3111 --------------------\n .../REPORTS/field_level_report_20260220_233432.md | 3111 --------------------\n .../REPORTS/field_level_report_20260220_233443.md | 3111 --------------------\n .../2026-02-21__dws_assistant_daily_bug_fix.md | 178 --\n .../SYSTEM/LOGS/2026-02-21__etl_full_bug_report.md | 443 ---\n 
export/SYSTEM/LOGS/2026-02-21__etl_run_raw.json | 428 ---\n export/SYSTEM/LOGS/2026-02-21__etl_run_raw_v2.json | 50 -\n export/SYSTEM/LOGS/2026-02-21__etl_run_raw_v3.json | 5 -\n export/SYSTEM/LOGS/2026-02-21__etl_run_raw_v4.json | 5 -\n export/SYSTEM/LOGS/2026-02-21__etl_run_raw_v5.json | 5 -\n export/SYSTEM/LOGS/2026-02-21__etl_run_raw_v6.json | 5 -\n export/SYSTEM/LOGS/2026-02-21__etl_run_raw_v7.json | 5 -\n export/SYSTEM/LOGS/2026-02-21__etl_run_raw_v8.json | 5 -\n export/SYSTEM/LOGS/2026-02-21__etl_run_result.md | 124 -\n .../SYSTEM/LOGS/2026-02-21__etl_run_result_v2.md | 136 -\n .../SYSTEM/LOGS/2026-02-21__etl_run_result_v3.md | 85 -\n .../SYSTEM/LOGS/2026-02-21__etl_run_result_v4.md | 70 -\n .../SYSTEM/LOGS/2026-02-21__etl_run_result_v5.md | 69 -\n .../SYSTEM/LOGS/2026-02-21__etl_run_result_v6.md | 59 -\n .../SYSTEM/LOGS/2026-02-21__etl_run_result_v8.md | 109 -\n scripts/audit/gen_audit_dashboard.py | 2 +-\n scripts/ops/gen_consolidated_ddl.py | 4 +-\n scripts/server/server-exclude.txt | 2 +-\n tests/test_etl_refactor_properties.py | 1 +\n 138 files changed, 2949 insertions(+), 32250 deletions(-)",
-  "high_risk_diff": "diff --git a/apps/backend/app/auth/dependencies.py b/apps/backend/app/auth/dependencies.py\nindex 5e4757f..a015fe6 100644\n--- a/apps/backend/app/auth/dependencies.py\n+++ b/apps/backend/app/auth/dependencies.py\n@@ -5,9 +5,15 @@ FastAPI 依赖注入:从 JWT 提取当前用户信息。\n @router.get(\"/protected\")\n async def protected_endpoint(user: CurrentUser = Depends(get_current_user)):\n print(user.user_id, user.site_id)\n+\n+ # 允许 pending 用户(受限令牌)访问\n+ @router.get(\"/apply\")\n+ async def apply_endpoint(user: CurrentUser = Depends(get_current_user_or_limited)):\n+ if user.limited:\n+ ... 
# 受限逻辑\n \"\"\"\n \n-from dataclasses import dataclass\n+from dataclasses import dataclass, field\n \n from fastapi import Depends, HTTPException, status\n from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer\n@@ -24,7 +30,10 @@ class CurrentUser:\n \"\"\"从 JWT 解析出的当前用户上下文。\"\"\"\n \n user_id: int\n- site_id: int\n+ site_id: int = 0\n+ roles: list[str] = field(default_factory=list)\n+ status: str = \"pending\"\n+ limited: bool = False\n \n \n async def get_current_user(\n@@ -33,7 +42,7 @@ async def get_current_user(\n \"\"\"\n FastAPI 依赖:从 Authorization header 提取 JWT,验证后返回用户信息。\n \n- 失败时抛出 401。\n+ 要求完整令牌(非 limited),失败时抛出 401。\n \"\"\"\n token = credentials.credentials\n try:\n@@ -45,6 +54,14 @@ async def get_current_user(\n headers={\"WWW-Authenticate\": \"Bearer\"},\n )\n \n+ # 受限令牌不允许通过此依赖\n+ if payload.get(\"limited\"):\n+ raise HTTPException(\n+ status_code=status.HTTP_401_UNAUTHORIZED,\n+ detail=\"受限令牌无法访问此端点\",\n+ headers={\"WWW-Authenticate\": \"Bearer\"},\n+ )\n+\n user_id_raw = payload.get(\"sub\")\n site_id = payload.get(\"site_id\")\n \n@@ -64,4 +81,78 @@ async def get_current_user(\n headers={\"WWW-Authenticate\": \"Bearer\"},\n )\n \n- return CurrentUser(user_id=user_id, site_id=site_id)\n+ roles = payload.get(\"roles\", [])\n+\n+ return CurrentUser(\n+ user_id=user_id,\n+ site_id=site_id,\n+ roles=roles,\n+ status=\"approved\",\n+ limited=False,\n+ )\n+\n+\n+async def get_current_user_or_limited(\n+ credentials: HTTPAuthorizationCredentials = Depends(_bearer_scheme),\n+) -> CurrentUser:\n+ \"\"\"\n+ FastAPI 依赖:允许 pending 用户(受限令牌)访问。\n+\n+ - 受限令牌(limited=True):返回 CurrentUser(limited=True, roles=[], status=\"pending\")\n+ - 完整令牌:正常返回 CurrentUser\n+ \"\"\"\n+ token = credentials.credentials\n+ try:\n+ payload = decode_access_token(token)\n+ except JWTError:\n+ raise HTTPException(\n+ status_code=status.HTTP_401_UNAUTHORIZED,\n+ detail=\"无效的令牌\",\n+ headers={\"WWW-Authenticate\": \"Bearer\"},\n+ )\n+\n+ user_id_raw = 
payload.get(\"sub\")\n+ if user_id_raw is None:\n+ raise HTTPException(\n+ status_code=status.HTTP_401_UNAUTHORIZED,\n+ detail=\"令牌缺少必要字段\",\n+ headers={\"WWW-Authenticate\": \"Bearer\"},\n+ )\n+\n+ try:\n+ user_id = int(user_id_raw)\n+ except (TypeError, ValueError):\n+ raise HTTPException(\n+ status_code=status.HTTP_401_UNAUTHORIZED,\n+ detail=\"令牌中 user_id 格式无效\",\n+ headers={\"WWW-Authenticate\": \"Bearer\"},\n+ )\n+\n+ # 受限令牌:pending 用户\n+ if payload.get(\"limited\"):\n+ return CurrentUser(\n+ user_id=user_id,\n+ site_id=0,\n+ roles=[],\n+ status=\"pending\",\n+ limited=True,\n+ )\n+\n+ # 完整令牌:要求 site_id\n+ site_id = payload.get(\"site_id\")\n+ if site_id is None:\n+ raise HTTPException(\n+ status_code=status.HTTP_401_UNAUTHORIZED,\n+ detail=\"令牌缺少必要字段\",\n+ headers={\"WWW-Authenticate\": \"Bearer\"},\n+ )\n+\n+ roles = payload.get(\"roles\", [])\n+\n+ return CurrentUser(\n+ user_id=user_id,\n+ site_id=site_id,\n+ roles=roles,\n+ status=\"approved\",\n+ limited=False,\n+ )\ndiff --git a/apps/backend/app/auth/jwt.py b/apps/backend/app/auth/jwt.py\nindex 5be3acc..227b4be 100644\n--- a/apps/backend/app/auth/jwt.py\n+++ b/apps/backend/app/auth/jwt.py\n@@ -27,11 +27,14 @@ def hash_password(password: str) -> str:\n return bcrypt.hashpw(password.encode(\"utf-8\"), bcrypt.gensalt()).decode(\"utf-8\")\n \n \n-def create_access_token(user_id: int, site_id: int) -> str:\n+def create_access_token(\n+ user_id: int, site_id: int, roles: list[str] | None = None\n+) -> str:\n \"\"\"\n 生成 access_token。\n \n- payload: sub=user_id, site_id, type=access, exp\n+ payload: sub=user_id, site_id, roles, type=access, exp\n+ roles 参数默认 None,保持向后兼容。\n \"\"\"\n expire = datetime.now(timezone.utc) + timedelta(\n minutes=config.JWT_ACCESS_TOKEN_EXPIRE_MINUTES\n@@ -42,6 +45,8 @@ def create_access_token(user_id: int, site_id: int) -> str:\n \"type\": \"access\",\n \"exp\": expire,\n }\n+ if roles is not None:\n+ payload[\"roles\"] = roles\n return jwt.encode(payload, config.JWT_SECRET_KEY, 
algorithm=config.JWT_ALGORITHM)\n \n \n@@ -63,15 +68,46 @@ def create_refresh_token(user_id: int, site_id: int) -> str:\n return jwt.encode(payload, config.JWT_SECRET_KEY, algorithm=config.JWT_ALGORITHM)\n \n \n-def create_token_pair(user_id: int, site_id: int) -> dict[str, str]:\n+def create_token_pair(user_id: int, site_id: int, roles: list[str] | None = None) -> dict[str, str]:\n \"\"\"生成 access_token + refresh_token 令牌对。\"\"\"\n return {\n- \"access_token\": create_access_token(user_id, site_id),\n+ \"access_token\": create_access_token(user_id, site_id, roles=roles),\n \"refresh_token\": create_refresh_token(user_id, site_id),\n \"token_type\": \"bearer\",\n }\n \n \n+def create_limited_token_pair(user_id: int) -> dict[str, str]:\n+ \"\"\"\n+ 为 pending 用户签发受限令牌。\n+\n+ payload 不含 site_id 和 roles,仅包含 user_id + type + limited=True。\n+ 受限令牌仅允许访问申请提交和状态查询端点。\n+ \"\"\"\n+ now = datetime.now(timezone.utc)\n+ access_payload = {\n+ \"sub\": str(user_id),\n+ \"type\": \"access\",\n+ \"limited\": True,\n+ \"exp\": now + timedelta(minutes=config.JWT_ACCESS_TOKEN_EXPIRE_MINUTES),\n+ }\n+ refresh_payload = {\n+ \"sub\": str(user_id),\n+ \"type\": \"refresh\",\n+ \"limited\": True,\n+ \"exp\": now + timedelta(days=config.JWT_REFRESH_TOKEN_EXPIRE_DAYS),\n+ }\n+ return {\n+ \"access_token\": jwt.encode(\n+ access_payload, config.JWT_SECRET_KEY, algorithm=config.JWT_ALGORITHM\n+ ),\n+ \"refresh_token\": jwt.encode(\n+ refresh_payload, config.JWT_SECRET_KEY, algorithm=config.JWT_ALGORITHM\n+ ),\n+ \"token_type\": \"bearer\",\n+ }\n+\n+\n def decode_token(token: str) -> dict:\n \"\"\"\n 解码并验证 JWT 令牌。\ndiff --git a/apps/backend/app/main.py b/apps/backend/app/main.py\nindex 5aa5df2..bab0063 100644\n--- a/apps/backend/app/main.py\n+++ b/apps/backend/app/main.py\n@@ -14,7 +14,8 @@ from app import config\n # CHANGE 2026-02-19 | 新增 xcx_test 路由(MVP 验证)+ wx_callback 路由(微信消息推送)\n # CHANGE 2026-02-22 | 新增 member_birthday 路由(助教手动补录会员生日)\n # CHANGE 2026-02-23 | 新增 ops_panel 路由(运维控制面板)\n-from 
app.routers import auth, execution, schedules, tasks, env_config, db_viewer, etl_status, xcx_test, wx_callback, member_birthday, ops_panel\n+# CHANGE 2026-02-25 | 新增 xcx_auth 路由(小程序微信登录 + 申请 + 状态查询 + 店铺切换)\n+from app.routers import auth, execution, schedules, tasks, env_config, db_viewer, etl_status, xcx_test, wx_callback, member_birthday, ops_panel, xcx_auth\n from app.services.scheduler import scheduler\n from app.services.task_queue import task_queue\n from app.ws.logs import ws_router\n@@ -64,6 +65,7 @@ app.include_router(xcx_test.router)\n app.include_router(wx_callback.router)\n app.include_router(member_birthday.router)\n app.include_router(ops_panel.router)\n+app.include_router(xcx_auth.router)\n \n \n @app.get(\"/health\", tags=[\"系统\"])\ndiff --git a/apps/backend/app/services/task_registry.py b/apps/backend/app/services/task_registry.py\nindex 118ca43..c271e30 100644\n--- a/apps/backend/app/services/task_registry.py\n+++ b/apps/backend/app/services/task_registry.py\n@@ -44,7 +44,7 @@ class DwdTableDefinition:\n ODS_TASKS: list[TaskDefinition] = [\n TaskDefinition(\"ODS_ASSISTANT_ACCOUNT\", \"助教账号\", \"抽取助教账号主数据\", \"助教\", \"ODS\", is_ods=True),\n TaskDefinition(\"ODS_ASSISTANT_LEDGER\", \"助教服务记录\", \"抽取助教服务流水\", \"助教\", \"ODS\", is_ods=True),\n- TaskDefinition(\"ODS_ASSISTANT_ABOLISH\", \"助教取消记录\", \"抽取助教取消/作废记录\", \"助教\", \"ODS\", is_ods=True),\n+\n TaskDefinition(\"ODS_SETTLEMENT_RECORDS\", \"结算记录\", \"抽取订单结算记录\", \"结算\", \"ODS\", is_ods=True),\n # CHANGE [2026-07-20] intent: 同步 ETL 侧移除——ODS_SETTLEMENT_TICKET 已在 Task 7.3 中彻底移除\n TaskDefinition(\"ODS_TABLE_USE\", \"台费流水\", \"抽取台费使用流水\", \"台桌\", \"ODS\", is_ods=True),\n@@ -65,6 +65,7 @@ ODS_TASKS: list[TaskDefinition] = [\n TaskDefinition(\"ODS_STORE_GOODS\", \"门店商品\", \"抽取门店商品主数据\", \"商品\", \"ODS\", is_ods=True, requires_window=False),\n TaskDefinition(\"ODS_STORE_GOODS_SALES\", \"商品销售\", \"抽取门店商品销售记录\", \"商品\", \"ODS\", is_ods=True),\n TaskDefinition(\"ODS_TENANT_GOODS\", \"租户商品\", \"抽取租户级商品主数据\", 
\"商品\", \"ODS\", is_ods=True, requires_window=False),\n+ TaskDefinition(\"ODS_STAFF_INFO\", \"员工档案\", \"抽取员工档案(含在职/离职)\", \"助教\", \"ODS\", is_ods=True, requires_window=False),\n ]\n \n # ── DWD 任务定义 ──────────────────────────────────────────────\n@@ -105,18 +106,17 @@ INDEX_TASKS: list[TaskDefinition] = [\n TaskDefinition(\"DWS_ML_MANUAL_IMPORT\", \"手动导入 (ML)\", \"手动导入机器学习数据\", \"指数\", \"INDEX\", requires_window=False, is_common=False),\n # CHANGE [2026-02-19] intent: 补充说明 RelationIndexTask 产出 RS/OS/MS/ML 四个子指数\n TaskDefinition(\"DWS_RELATION_INDEX\", \"关系指数 (RS)\", \"产出 RS/OS/MS/ML 四个子指数\", \"指数\", \"INDEX\"),\n+ TaskDefinition(\"DWS_SPENDING_POWER_INDEX\", \"消费力指数 (SPI)\", \"计算会员消费力指数\", \"指数\", \"INDEX\"),\n ]\n \n # ── 工具类任务定义 ────────────────────────────────────────────\n \n UTILITY_TASKS: list[TaskDefinition] = [\n TaskDefinition(\"MANUAL_INGEST\", \"手动导入\", \"从本地 JSON 文件手动导入数据\", \"工具\", \"UTILITY\", requires_window=False, is_common=False),\n- TaskDefinition(\"INIT_ODS_SCHEMA\", \"初始化 ODS Schema\", \"创建 ODS 层表结构\", \"工具\", \"UTILITY\", requires_window=False, is_common=False),\n- TaskDefinition(\"INIT_DWD_SCHEMA\", \"初始化 DWD Schema\", \"创建 DWD 层表结构\", \"工具\", \"UTILITY\", requires_window=False, is_common=False),\n- TaskDefinition(\"INIT_DWS_SCHEMA\", \"初始化 DWS Schema\", \"创建 DWS 层表结构\", \"工具\", \"UTILITY\", requires_window=False, is_common=False),\n+ # CHANGE [2026-02-24] intent: 移除 4 个一次性初始化任务(INIT_ODS/DWD/DWS_SCHEMA、SEED_DWS_CONFIG),\n+ # 环境已搭建完成,仅保留 ETL 侧实现供运维脚本直接 import 使用,UI 不再展示\n TaskDefinition(\"ODS_JSON_ARCHIVE\", \"ODS JSON 归档\", \"归档 ODS 原始 JSON 文件\", \"工具\", \"UTILITY\", requires_window=False, is_common=False),\n TaskDefinition(\"CHECK_CUTOFF\", \"游标检查\", \"检查各任务数据游标截止点\", \"工具\", \"UTILITY\", requires_window=False, is_common=False),\n- TaskDefinition(\"SEED_DWS_CONFIG\", \"DWS 配置种子\", \"初始化 DWS 配置数据\", \"工具\", \"UTILITY\", requires_window=False, is_common=False),\n TaskDefinition(\"DATA_INTEGRITY_CHECK\", \"数据完整性校验\", \"校验跨层数据完整性\", \"工具\", 
\"UTILITY\", requires_window=False, is_common=False),\n ]\n \n@@ -202,8 +202,7 @@ DWD_TABLES: list[DwdTableDefinition] = [\n DwdTableDefinition(\"dwd.dwd_store_goods_sale_ex\", \"商品销售(扩展)\", \"商品\", \"ods.store_goods_sales_records\"),\n DwdTableDefinition(\"dwd.dwd_assistant_service_log\", \"助教服务流水\", \"助教\", \"ods.assistant_service_records\"),\n DwdTableDefinition(\"dwd.dwd_assistant_service_log_ex\", \"助教服务流水(扩展)\", \"助教\", \"ods.assistant_service_records\"),\n- DwdTableDefinition(\"dwd.dwd_assistant_trash_event\", \"助教取消事件\", \"助教\", \"ods.assistant_cancellation_records\"),\n- DwdTableDefinition(\"dwd.dwd_assistant_trash_event_ex\", \"助教取消事件(扩展)\", \"助教\", \"ods.assistant_cancellation_records\"),\n+ # CHANGE [2026-02-24] intent: 移除已废弃的 assistant_trash_event 表定义(ODS_ASSISTANT_ABOLISH 全链路已清理)\n DwdTableDefinition(\"dwd.dwd_member_balance_change\", \"会员余额变动\", \"会员\", \"ods.member_balance_changes\"),\n DwdTableDefinition(\"dwd.dwd_member_balance_change_ex\", \"会员余额变动(扩展)\", \"会员\", \"ods.member_balance_changes\"),\n DwdTableDefinition(\"dwd.dwd_groupbuy_redemption\", \"团购核销\", \"团购\", \"ods.group_buy_redemption_records\"),\ndiff --git a/apps/etl/connectors/feiqiu/orchestration/flow_runner.py b/apps/etl/connectors/feiqiu/orchestration/flow_runner.py\nindex 8351b32..3fd8032 100644\n--- a/apps/etl/connectors/feiqiu/orchestration/flow_runner.py\n+++ b/apps/etl/connectors/feiqiu/orchestration/flow_runner.py\n@@ -166,7 +166,11 @@ class FlowRunner:\n \n timer.start_step(\"INCREMENT_ETL\")\n if task_codes:\n- results = self.task_executor.run_tasks(task_codes, data_source=data_source)\n+ # CHANGE [2026-02-24] intent: 对前端传入的 task_codes 也执行拓扑排序,\n+ # 避免 DWS 在 DWD 未完成时就开始计算(跨层依赖顺序缺失 bug)\n+ # prompt: \"修复管理后台全选任务时不按层级顺序执行的问题\"\n+ sorted_codes = topological_sort(task_codes, self.task_registry)\n+ results = self.task_executor.run_tasks(sorted_codes, data_source=data_source)\n else:\n auto_tasks = self._resolve_tasks(layers)\n results = self.task_executor.run_tasks(auto_tasks, 
data_source=data_source)\ndiff --git a/apps/etl/connectors/feiqiu/orchestration/task_executor.py b/apps/etl/connectors/feiqiu/orchestration/task_executor.py\nindex 0d68f44..142b24c 100644\n--- a/apps/etl/connectors/feiqiu/orchestration/task_executor.py\n+++ b/apps/etl/connectors/feiqiu/orchestration/task_executor.py\n@@ -107,6 +107,11 @@ class TaskExecutor:\n results.append(result_entry)\n except Exception as exc: # noqa: BLE001\n self.logger.error(\"任务 %s 失败: %s\", task_code, exc, exc_info=True)\n+ # CHANGE 2026-02-24 | 任务失败后 rollback,防止 InFailedSqlTransaction 级联\n+ try:\n+ self.db.rollback()\n+ except Exception:\n+ pass\n results.append({\n \"task_code\": task_code,\n \"status\": \"失败\",\ndiff --git a/apps/etl/connectors/feiqiu/orchestration/task_registry.py b/apps/etl/connectors/feiqiu/orchestration/task_registry.py\nindex b6499b9..38fa90b 100644\n--- a/apps/etl/connectors/feiqiu/orchestration/task_registry.py\n+++ b/apps/etl/connectors/feiqiu/orchestration/task_registry.py\n@@ -30,6 +30,7 @@ from tasks.utility.seed_dws_config_task import SeedDwsConfigTask\n # DWS 层任务导入\n from tasks.dws import (\n AssistantDailyTask,\n+ AssistantOrderContributionTask,\n AssistantMonthlyTask,\n AssistantCustomerTask,\n AssistantSalaryTask,\n@@ -147,6 +148,7 @@ default_registry.register(\"DATA_INTEGRITY_CHECK\", DataIntegrityTask, requires_db\n # ── DWS 层业务任务 ────────────────────────────────────────────\n default_registry.register(\"DWS_BUILD_ORDER_SUMMARY\", DwsBuildOrderSummaryTask, requires_db_config=False, layer=\"DWS\")\n default_registry.register(\"DWS_ASSISTANT_DAILY\", AssistantDailyTask, layer=\"DWS\")\n+default_registry.register(\"DWS_ASSISTANT_ORDER_CONTRIBUTION\", AssistantOrderContributionTask, layer=\"DWS\", depends_on=[\"DWD_LOAD_FROM_ODS\"])\n # CHANGE [2026-07-17] intent: 为已知依赖关系添加 depends_on 声明(需求 8.1, 8.2)\n default_registry.register(\"DWS_ASSISTANT_MONTHLY\", AssistantMonthlyTask, layer=\"DWS\", depends_on=[\"DWS_ASSISTANT_DAILY\"])\n 
default_registry.register(\"DWS_ASSISTANT_CUSTOMER\", AssistantCustomerTask, layer=\"DWS\")\n@@ -166,7 +168,8 @@ default_registry.register(\"DWS_GOODS_STOCK_MONTHLY\", GoodsStockMonthlyTask, laye\n # 替换为统一维护任务 DWS_MAINTENANCE(需求 4.5)\n # depends_on: 所有其他 DWS 任务——MV 刷新和清理应在数据写入后执行\n default_registry.register(\"DWS_MAINTENANCE\", DwsMaintenanceTask, layer=\"DWS\", depends_on=[\n- \"DWS_ASSISTANT_DAILY\", \"DWS_ASSISTANT_MONTHLY\", \"DWS_ASSISTANT_CUSTOMER\",\n+ \"DWS_ASSISTANT_DAILY\", \"DWS_ASSISTANT_ORDER_CONTRIBUTION\",\n+ \"DWS_ASSISTANT_MONTHLY\", \"DWS_ASSISTANT_CUSTOMER\",\n \"DWS_ASSISTANT_SALARY\", \"DWS_ASSISTANT_FINANCE\",\n \"DWS_MEMBER_CONSUMPTION\", \"DWS_MEMBER_VISIT\",\n \"DWS_FINANCE_DAILY\", \"DWS_FINANCE_RECHARGE\",\ndiff --git a/apps/etl/connectors/feiqiu/orchestration/topological_sort.py b/apps/etl/connectors/feiqiu/orchestration/topological_sort.py\nindex e6dc081..bc18098 100644\n--- a/apps/etl/connectors/feiqiu/orchestration/topological_sort.py\n+++ b/apps/etl/connectors/feiqiu/orchestration/topological_sort.py\n@@ -2,6 +2,8 @@\n \"\"\"拓扑排序模块 — Kahn's algorithm\n \n 对任务列表按依赖关系执行拓扑排序:\n+- 显式依赖:TaskMeta.depends_on 声明的任务间依赖\n+- 隐含层级依赖:ODS → DWD → DWS → INDEX,同批任务中低层任务必须先于高层任务\n - 仅对当前执行列表内的任务排序\n - depends_on 中引用的任务不在列表内时记录警告\n - 检测循环依赖并抛出 ValueError\n@@ -11,10 +13,22 @@ import logging\n \n logger = logging.getLogger(__name__)\n \n+# 层级优先级:数值越小越先执行\n+_LAYER_ORDER: dict[str, int] = {\n+ \"ODS\": 0,\n+ \"DWD\": 1,\n+ \"DWS\": 2,\n+ \"INDEX\": 3,\n+}\n+\n \n def topological_sort(task_codes: list[str], registry) -> list[str]:\n \"\"\"对任务列表执行拓扑排序(Kahn's algorithm)。\n \n+ 除了显式 depends_on 依赖外,还注入隐含的层级依赖:\n+ 同批任务中,所有 ODS 任务排在 DWD 之前,DWD 排在 DWS 之前,\n+ DWS 排在 INDEX 之前。这确保跨层执行顺序正确。\n+\n Args:\n task_codes: 待排序的任务代码列表\n registry: TaskRegistry 实例,提供 get_metadata() 查询依赖\n@@ -29,9 +43,10 @@ def topological_sort(task_codes: list[str], registry) -> list[str]:\n return []\n \n in_degree = {code: 0 for code in task_codes}\n- graph = {code: [] for code in 
task_codes}\n+ graph: dict[str, list[str]] = {code: [] for code in task_codes}\n task_set = set(task_codes)\n \n+ # 1. 显式依赖(depends_on)\n for code in task_codes:\n meta = registry.get_metadata(code)\n if meta and meta.depends_on:\n@@ -44,6 +59,31 @@ def topological_sort(task_codes: list[str], registry) -> list[str]:\n \"任务 %s 依赖 %s,但后者不在当前执行列表中\", code, dep\n )\n \n+ # CHANGE [2026-02-24] intent: 注入隐含层级依赖,确保跨层执行顺序正确\n+ # assumptions: 层级顺序固定为 ODS→DWD→DWS→INDEX;同层任务无隐含互相依赖\n+ # prompt: \"修复管理后台全选任务时不按层级顺序执行的问题\"\n+ # 2. 隐含层级依赖:按层分组,相邻层之间建立边\n+ # 选择每层一个\"代表节点\"作为屏障,避免 O(n*m) 的全连接边\n+ layer_groups: dict[int, list[str]] = {}\n+ for code in task_codes:\n+ meta = registry.get_metadata(code)\n+ if meta and meta.layer:\n+ order = _LAYER_ORDER.get(meta.layer.upper())\n+ if order is not None:\n+ layer_groups.setdefault(order, []).append(code)\n+\n+ sorted_layers = sorted(layer_groups.keys())\n+ for i in range(len(sorted_layers) - 1):\n+ lower_layer = sorted_layers[i]\n+ higher_layer = sorted_layers[i + 1]\n+ # 高层的每个任务都依赖低层的所有任务\n+ for higher_code in layer_groups[higher_layer]:\n+ for lower_code in layer_groups[lower_layer]:\n+ # 避免重复添加已有的显式依赖边\n+ if higher_code not in graph[lower_code]:\n+ graph[lower_code].append(higher_code)\n+ in_degree[higher_code] += 1\n+\n queue = deque(code for code in task_codes if in_degree[code] == 0)\n result = []\n while queue:\ndiff --git a/apps/etl/connectors/feiqiu/quality/consistency_checker.py b/apps/etl/connectors/feiqiu/quality/consistency_checker.py\nindex dc2b075..1bfdf2a 100644\n--- a/apps/etl/connectors/feiqiu/quality/consistency_checker.py\n+++ b/apps/etl/connectors/feiqiu/quality/consistency_checker.py\n@@ -606,6 +606,11 @@ def run_consistency_check(\n report.ods_vs_dwd_results.append(result)\n \n except Exception as exc:\n+ # CHANGE 2026-02-24 | rollback 防止 InFailedSqlTransaction 级联到后续表检查\n+ try:\n+ db_conn.conn.rollback()\n+ except Exception:\n+ pass\n result = TableCheckResult(\n table_name=dwd_full,\n 
check_type=\"ods_vs_dwd\",\ndiff --git a/apps/etl/connectors/feiqiu/tasks/dwd/dwd_load_task.py b/apps/etl/connectors/feiqiu/tasks/dwd/dwd_load_task.py\nindex 66285ae..d2b6c27 100644\n--- a/apps/etl/connectors/feiqiu/tasks/dwd/dwd_load_task.py\n+++ b/apps/etl/connectors/feiqiu/tasks/dwd/dwd_load_task.py\n@@ -269,6 +269,9 @@ class DwdLoadTask(BaseTask):\n (\"days_on_shelf\", \"days_available\", None),\n (\"sort_order\", \"sort\", None),\n (\"time_slot_sale\", \"time_slot_sale\", None), # CHANGE 2026-02-21: 新增分时段销售标记\n+ (\"warning_sales_day\", \"warning_sales_day\", None), # CHANGE 2026-02-24: 库存预警日均销量\n+ (\"warning_day_max\", \"warning_day_max\", None), # CHANGE 2026-02-24: 预警天数上限\n+ (\"warning_day_min\", \"warning_day_min\", None), # CHANGE 2026-02-24: 预警天数下限\n ],\n \"dwd.dim_goods_category\": [\n (\"category_id\", \"id\", None),\ndiff --git a/apps/etl/connectors/feiqiu/tasks/dws/__init__.py b/apps/etl/connectors/feiqiu/tasks/dws/__init__.py\nindex a585feb..f53b0e8 100644\n--- a/apps/etl/connectors/feiqiu/tasks/dws/__init__.py\n+++ b/apps/etl/connectors/feiqiu/tasks/dws/__init__.py\n@@ -13,6 +13,7 @@ DWS层ETL任务模块\n \n from .base_dws_task import BaseDwsTask, TimeLayer, TimeWindow, CourseType, DiscountType\n from .assistant_daily_task import AssistantDailyTask\n+from .assistant_order_contribution_task import AssistantOrderContributionTask\n from .assistant_monthly_task import AssistantMonthlyTask\n from .assistant_customer_task import AssistantCustomerTask\n from .assistant_salary_task import AssistantSalaryTask\n@@ -47,6 +48,7 @@ __all__ = [\n \"DiscountType\",\n # 助教维度\n \"AssistantDailyTask\",\n+ \"AssistantOrderContributionTask\",\n \"AssistantMonthlyTask\",\n \"AssistantCustomerTask\",\n \"AssistantSalaryTask\",\ndiff --git a/apps/etl/connectors/feiqiu/tasks/dws/assistant_daily_task.py b/apps/etl/connectors/feiqiu/tasks/dws/assistant_daily_task.py\nindex 4ddbeed..b6c0e0a 100644\n--- a/apps/etl/connectors/feiqiu/tasks/dws/assistant_daily_task.py\n+++ 
b/apps/etl/connectors/feiqiu/tasks/dws/assistant_daily_task.py\n@@ -29,12 +29,19 @@\n \n from __future__ import annotations\n \n-from datetime import date, datetime, timedelta\n-from decimal import Decimal\n+from collections import defaultdict\n+from datetime import date, datetime, time, timedelta\n+from decimal import Decimal, ROUND_HALF_UP\n from typing import Any, Dict, List, Optional, Set, Tuple\n \n from .base_dws_task import BaseDwsTask, CourseType, TaskContext\n \n+# 惩罚区域集合:大厅 A/B/C/S/TV + 麻将房 M1–M7\n+PENALTY_AREAS: Set[str] = {\n+ \"A\", \"B\", \"C\", \"S\", \"TV\",\n+ \"M1\", \"M2\", \"M3\", \"M4\", \"M5\", \"M6\", \"M7\",\n+}\n+\n \n class AssistantDailyTask(BaseDwsTask):\n \"\"\"\n@@ -93,7 +100,7 @@ class AssistantDailyTask(BaseDwsTask):\n \n def transform(self, extracted: Dict[str, Any], context: TaskContext) -> List[Dict[str, Any]]:\n \"\"\"\n- 转换数据:按助教+日期聚合\n+ 转换数据:按助教+日期聚合,并执行定档折算惩罚检测\n \"\"\"\n service_records = extracted['service_records']\n site_id = extracted['site_id']\n@@ -108,6 +115,68 @@ class AssistantDailyTask(BaseDwsTask):\n service_records, \n site_id\n )\n+\n+ # ── 定档折算惩罚检测 ──\n+ # 构造重叠检测所需的记录格式\n+ overlap_records = []\n+ for r in service_records:\n+ start_t = r.get(\"start_use_time\")\n+ end_t = r.get(\"last_use_time\")\n+ if start_t is None or end_t is None:\n+ continue\n+ overlap_records.append({\n+ \"assistant_id\": r.get(\"assistant_id\"),\n+ \"table_id\": r.get(\"table_id\"),\n+ \"table_area\": r.get(\"table_area_name\", \"\"),\n+ \"start_time\": start_t,\n+ \"end_time\": end_t,\n+ \"service_date\": r.get(\"service_date\"),\n+ })\n+\n+ violations = self.detect_overlap_violations(overlap_records, PENALTY_AREAS)\n+\n+ # 将惩罚信息填充到聚合结果\n+ for agg in aggregated:\n+ aid = agg[\"assistant_id\"]\n+ stat_date = agg[\"stat_date\"]\n+ key = (aid, stat_date)\n+\n+ if agg.get(\"is_exempt\"):\n+ # 豁免:不计算惩罚\n+ agg[\"penalty_minutes\"] = Decimal(\"0\")\n+ agg[\"penalty_reason\"] = None\n+ agg[\"is_exempt\"] = True\n+ agg[\"per_hour_contribution\"] 
= None\n+ elif key in violations:\n+ # 有违规:计算惩罚\n+ # 同一天可能有多条违规,取重叠人数最多(最严重)的一条\n+ v_list = violations[key]\n+ overlap_count = max(v[\"overlap_count\"] for v in v_list)\n+ # per_hour_contribution 需要从台费数据计算\n+ # 此处使用聚合后的 base_ledger_amount 和 base_hours 近似\n+ base_hours = agg.get(\"base_hours\", Decimal(\"0\"))\n+ base_amount = agg.get(\"base_ledger_amount\", Decimal(\"0\"))\n+ if base_hours > 0:\n+ per_hour = base_amount / base_hours / Decimal(str(overlap_count))\n+ else:\n+ per_hour = Decimal(\"0\")\n+\n+ actual_minutes = agg.get(\"base_hours\", Decimal(\"0\")) * Decimal(\"60\")\n+ penalty = self.compute_penalty_minutes(actual_minutes, per_hour)\n+\n+ agg[\"penalty_minutes\"] = penalty\n+ agg[\"penalty_reason\"] = (\n+ f\"规则2违规:同台桌{overlap_count}名助教重叠挂台,\"\n+ f\"单人每小时贡献={per_hour:.2f}元\"\n+ )\n+ agg[\"is_exempt\"] = False\n+ agg[\"per_hour_contribution\"] = per_hour\n+ else:\n+ # 无违规\n+ agg[\"penalty_minutes\"] = Decimal(\"0\")\n+ agg[\"penalty_reason\"] = None\n+ agg[\"is_exempt\"] = False\n+ agg[\"per_hour_contribution\"] = None\n \n return aggregated\n \n@@ -143,6 +212,9 @@ class AssistantDailyTask(BaseDwsTask):\n asl.real_use_seconds,\n asl.ledger_amount,\n asl.ledger_unit_price,\n+ asl.start_use_time,\n+ asl.last_use_time,\n+ asl.table_area_name,\n DATE(asl.start_use_time) AS service_date,\n COALESCE(ex.is_trash, 0) AS is_trash\n FROM dwd.dwd_assistant_service_log asl\n@@ -281,6 +353,131 @@ class AssistantDailyTask(BaseDwsTask):\n \n return result\n \n+ # ==========================================================================\n+ # 定档折算惩罚 — 纯函数(静态方法,不依赖数据库)\n+ # ==========================================================================\n+\n+ @staticmethod\n+ def detect_overlap_violations(\n+ service_records: List[Dict[str, Any]],\n+ penalty_areas: Set[str],\n+ ) -> Dict[Tuple[int, date], List[Dict[str, Any]]]:\n+ \"\"\"\n+ 检测同一台桌同一时间段超过 2 名助教挂台的违规。\n+\n+ 输入:\n+ service_records: 服务记录\n[TRUNCATED: apps/etl/connectors/feiqiu/tasks/dws/assistant_daily_task.py diff 
too long]\ndiff --git a/apps/etl/connectors/feiqiu/tasks/dws/member_consumption_task.py b/apps/etl/connectors/feiqiu/tasks/dws/member_consumption_task.py\nindex a7bd10e..e618dbb 100644\n--- a/apps/etl/connectors/feiqiu/tasks/dws/member_consumption_task.py\n+++ b/apps/etl/connectors/feiqiu/tasks/dws/member_consumption_task.py\n@@ -85,10 +85,14 @@ class MemberConsumptionTask(BaseDwsTask):\n # 3. 获取会员卡余额\n card_balances = self._extract_card_balances(site_id)\n \n+ # CHANGE 2025-07-15 | task 4.1: 获取充值统计(30/60/90 天窗口)\n+ recharge_stats = self._extract_recharge_stats(site_id, stat_date)\n+ \n return {\n 'consumption_stats': consumption_stats,\n 'member_info': member_info,\n 'card_balances': card_balances,\n+ 'recharge_stats': recharge_stats,\n 'stat_date': stat_date,\n 'site_id': site_id\n }\n@@ -100,6 +104,7 @@ class MemberConsumptionTask(BaseDwsTask):\n consumption_stats = extracted['consumption_stats']\n member_info = extracted['member_info']\n card_balances = extracted['card_balances']\n+ recharge_stats = extracted.get('recharge_stats', {})\n stat_date = extracted['stat_date']\n site_id = extracted['site_id']\n \n@@ -119,11 +124,20 @@ class MemberConsumptionTask(BaseDwsTask):\n \n memb_info = member_info.get(member_id, {})\n balance = card_balances.get(member_id, {})\n+ # CHANGE 2025-07-15 | task 4.2: 合并充值统计,无记录时默认 0\n+ recharge = recharge_stats.get(member_id, {})\n \n # 计算活跃度和客户分层\n days_since_last = self._calc_days_since(stat_date, stats.get('last_consume_date'))\n customer_tier = self._calculate_customer_tier(stats, days_since_last)\n \n+ # CHANGE 2025-07-15 | task 4.2: 次均消费 = total_consume_amount / MAX(total_visit_count, 1)\n+ total_consume_amount = self.safe_decimal(stats.get('total_consume_amount', 0))\n+ total_visit_count = self.safe_int(stats.get('total_visit_count', 0))\n+ avg_ticket_amount = (\n+ total_consume_amount / max(total_visit_count, 1)\n+ ).quantize(Decimal('0.01'))\n+ \n record = {\n 'site_id': site_id,\n 'tenant_id': 
self.config.get(\"app.tenant_id\", site_id),\n@@ -137,8 +151,8 @@ class MemberConsumptionTask(BaseDwsTask):\n # 全量累计统计\n 'first_consume_date': stats.get('first_consume_date'),\n 'last_consume_date': stats.get('last_consume_date'),\n- 'total_visit_count': self.safe_int(stats.get('total_visit_count', 0)),\n- 'total_consume_amount': self.safe_decimal(stats.get('total_consume_amount', 0)),\n+ 'total_visit_count': total_visit_count,\n+ 'total_consume_amount': total_consume_amount,\n 'total_recharge_amount': self.safe_decimal(memb_info.get('recharge_money_sum', 0)),\n 'total_table_fee': self.safe_decimal(stats.get('total_table_fee', 0)),\n 'total_goods_amount': self.safe_decimal(stats.get('total_goods_amount', 0)),\n@@ -156,6 +170,15 @@ class MemberConsumptionTask(BaseDwsTask):\n 'consume_amount_30d': self.safe_decimal(stats.get('consume_amount_30d', 0)),\n 'consume_amount_60d': self.safe_decimal(stats.get('consume_amount_60d', 0)),\n 'consume_amount_90d': self.safe_decimal(stats.get('consume_amount_90d', 0)),\n+ # 充值窗口统计(30/60/90 天)\n+ 'recharge_count_30d': self.safe_int(recharge.get('count_30d', 0)),\n+ 'recharge_count_60d': self.safe_int(recharge.get('count_60d', 0)),\n+ 'recharge_count_90d': self.safe_int(recharge.get('count_90d', 0)),\n+ 'recharge_amount_30d': self.safe_decimal(recharge.get('amount_30d', 0)),\n+ 'recharge_amount_60d': self.safe_decimal(recharge.get('amount_60d', 0)),\n+ 'recharge_amount_90d': self.safe_decimal(recharge.get('amount_90d', 0)),\n+ # 次均消费\n+ 'avg_ticket_amount': avg_ticket_amount,\n # 卡余额\n 'cash_card_balance': self.safe_decimal(balance.get('cash_balance', 0)),\n 'gift_card_balance': self.safe_decimal(balance.get('gift_balance', 0)),\n@@ -259,13 +282,14 @@ class MemberConsumptionTask(BaseDwsTask):\n ) AS birthday\n FROM dwd.dim_member m\n WHERE m.member_id IN (\n- SELECT DISTINCT tenant_member_id\n+ SELECT DISTINCT member_id\n FROM dwd.dwd_settlement_head\n WHERE site_id = %s\n- AND tenant_member_id IS NOT NULL\n- AND tenant_member_id 
!= 0\n+ AND member_id IS NOT NULL\n+ AND member_id != 0\n ) AND m.scd2_is_current = 1\n \"\"\"\n+ # CHANGE 2026-02-24 | 修复列\n[TRUNCATED: apps/etl/connectors/feiqiu/tasks/dws/member_consumption_task.py diff too long]\n\n[TRUNCATED: diff exceeds 30KB]", - "latest_prompt_log": "- [P20260226-074515] 2026-02-26 07:45:15 +0800\n - summary: - 调查说明现在HOOKS是如何判断文件的修改?- 对这3个HOOKS的详细工作说明,处理流程,输出一个.md文档。\n - prompt:\n```text\n- 调查说明现在HOOKS是如何判断文件的修改?- 对这3个HOOKS的详细工作说明,处理流程,输出一个.md文档。\n```\n" -} \ No newline at end of file diff --git a/.kiro/.compliance_state.json b/.kiro/.compliance_state.json deleted file mode 100644 index 740d637..0000000 --- a/.kiro/.compliance_state.json +++ /dev/null @@ -1,74 +0,0 @@ -{ - "needs_check": true, - "scanned_at": "2026-02-26T08:03:23.664569+08:00", - "new_migration_sql": [ - "db/etl_feiqiu/migrations/2025-02-24__alter_assistant_daily_add_penalty_fields.sql", - "db/etl_feiqiu/migrations/2025-02-24__alter_member_consumption_add_recharge_fields.sql", - "db/etl_feiqiu/migrations/2025-02-24__create_dws_assistant_order_contribution.sql", - "db/etl_feiqiu/migrations/2025-02-24__create_rls_view_assistant_order_contribution.sql", - "db/etl_feiqiu/migrations/2026-02-24__add_goods_stock_warning_info.sql", - "db/etl_feiqiu/migrations/2026-02-24__cleanup_assistant_abolish_residual.sql", - "db/etl_feiqiu/migrations/2026-02-24__p1_create_app_schema_rls_views.sql", - "db/zqyy_app/migrations/2026-02-24__p1_create_auth_biz_schemas.sql", - "db/zqyy_app/migrations/2026-02-24__p1_setup_fdw_etl.sql", - "db/zqyy_app/migrations/2026-02-25__p3_create_auth_tables.sql", - "db/zqyy_app/migrations/2026-02-25__p3_seed_roles_permissions.sql" - ], - "new_or_modified_sql": [ - "db/_archived/ddl_baseline_2026-02-22/db/etl_feiqiu/schemas/dwd.sql", - "db/_archived/ddl_baseline_2026-02-22/db/etl_feiqiu/schemas/ods.sql", - "db/_archived/ddl_baseline_2026-02-22/db/zqyy_app/migrations/2025-02-24__add_fdw_dws_extensions.sql", - 
"db/etl_feiqiu/migrations/2025-02-24__alter_assistant_daily_add_penalty_fields.sql", - "db/etl_feiqiu/migrations/2025-02-24__alter_member_consumption_add_recharge_fields.sql", - "db/etl_feiqiu/migrations/2025-02-24__create_dws_assistant_order_contribution.sql", - "db/etl_feiqiu/migrations/2025-02-24__create_rls_view_assistant_order_contribution.sql", - "db/etl_feiqiu/migrations/2026-02-24__add_goods_stock_warning_info.sql", - "db/etl_feiqiu/migrations/2026-02-24__cleanup_assistant_abolish_residual.sql", - "db/etl_feiqiu/migrations/2026-02-24__p1_create_app_schema_rls_views.sql", - "db/etl_feiqiu/seeds/seed_ods_tasks.sql", - "db/etl_feiqiu/seeds/seed_scheduler_tasks.sql", - "db/zqyy_app/migrations/2026-02-24__p1_create_auth_biz_schemas.sql", - "db/zqyy_app/migrations/2026-02-24__p1_setup_fdw_etl.sql", - "db/zqyy_app/migrations/2026-02-25__p3_create_auth_tables.sql", - "db/zqyy_app/migrations/2026-02-25__p3_seed_roles_permissions.sql", - "docs/database/ddl/etl_feiqiu__app.sql", - "docs/database/ddl/etl_feiqiu__core.sql", - "docs/database/ddl/etl_feiqiu__dwd.sql", - "docs/database/ddl/etl_feiqiu__dws.sql", - "docs/database/ddl/etl_feiqiu__meta.sql", - "docs/database/ddl/etl_feiqiu__ods.sql", - "docs/database/ddl/fdw.sql", - "docs/database/ddl/zqyy_app__auth.sql", - "docs/database/ddl/zqyy_app__public.sql" - ], - "code_without_docs": [ - { - "file": "apps/etl/connectors/feiqiu/orchestration/flow_runner.py", - "expected_docs": [ - "apps/etl/connectors/feiqiu/docs/architecture/" - ] - }, - { - "file": "apps/etl/connectors/feiqiu/orchestration/task_executor.py", - "expected_docs": [ - "apps/etl/connectors/feiqiu/docs/architecture/" - ] - }, - { - "file": "apps/etl/connectors/feiqiu/orchestration/task_registry.py", - "expected_docs": [ - "apps/etl/connectors/feiqiu/docs/architecture/" - ] - }, - { - "file": "apps/etl/connectors/feiqiu/orchestration/topological_sort.py", - "expected_docs": [ - "apps/etl/connectors/feiqiu/docs/architecture/" - ] - } - ], - "new_files": [], - 
"has_bd_manual": true, - "has_audit_record": false, - "has_ddl_baseline": true -} \ No newline at end of file diff --git a/.kiro/.git_snapshot.json b/.kiro/.git_snapshot.json deleted file mode 100644 index b7e13eb..0000000 --- a/.kiro/.git_snapshot.json +++ /dev/null @@ -1,106 +0,0 @@ -{ - "files": [ - ".gitignore", - ".kiro/.audit_context.json", - ".kiro/.compliance_state.json", - ".kiro/.git_snapshot.json", - ".kiro/.last_prompt_id.json", - ".kiro/agents/audit-writer.md", - ".kiro/hooks/agent-on-stop.kiro.hook", - ".kiro/hooks/audit-flagger.kiro.hook", - ".kiro/hooks/audit-reminder.kiro.hook", - ".kiro/hooks/change-compliance.kiro.hook", - ".kiro/hooks/prompt-audit-log.kiro.hook", - ".kiro/hooks/prompt-on-submit.kiro.hook", - ".kiro/hooks/run-audit-writer.kiro.hook", - ".kiro/hooks/session-log.kiro.hook", - ".kiro/scripts/agent_on_stop.py", - ".kiro/scripts/build_audit_context.py", - ".kiro/scripts/change_compliance_prescan.py", - ".kiro/scripts/prompt_on_submit.py", - ".kiro/scripts/session_log.py", - ".kiro/specs/01-miniapp-db-foundation/.config.kiro", - ".kiro/specs/01-miniapp-db-foundation/design.md", - ".kiro/specs/01-miniapp-db-foundation/requirements.md", - ".kiro/specs/01-miniapp-db-foundation/tasks.md", - ".kiro/specs/02-etl-dws-miniapp-extensions/.config.kiro", - ".kiro/specs/02-etl-dws-miniapp-extensions/design.md", - ".kiro/specs/02-etl-dws-miniapp-extensions/requirements.md", - ".kiro/specs/02-etl-dws-miniapp-extensions/tasks.md", - ".kiro/specs/03-miniapp-auth-system/.config.kiro", - ".kiro/specs/03-miniapp-auth-system/design.md", - ".kiro/specs/03-miniapp-auth-system/requirements.md", - ".kiro/specs/03-miniapp-auth-system/tasks.md", - ".kiro/specs/[ETL]-fullstack-integration/.config.kiro", - ".kiro/specs/[ETL]-fullstack-integration/design.md", - ".kiro/specs/[ETL]-fullstack-integration/requirements.md", - ".kiro/specs/[ETL]-fullstack-integration/tasks.md", - ".kiro/specs/etl-fullstack-integration/design.md", - 
".kiro/specs/etl-fullstack-integration/tasks.md", - ".kiro/specs/miniapp-core-business/.config.kiro", - ".kiro/specs/miniapp-core-business/requirements.md", - ".kiro/specs/spi-spending-power-index/tasks.md", - ".kiro/steering/doc-map.md", - "README.md", - "apps/admin-web/README.md", - "apps/backend/app/auth/dependencies.py", - "apps/backend/app/auth/jwt.py", - "apps/backend/app/main.py", - "apps/backend/app/routers/xcx_auth.py", - "apps/backend/app/schemas/xcx_auth.py", - "apps/backend/app/services/application.py", - "apps/backend/app/services/matching.py", - "apps/backend/app/services/role.py", - "apps/backend/app/services/task_registry.py", - "apps/backend/app/services/wechat.py", - "apps/backend/auth_only.txt", - "apps/backend/auth_only_results.txt", - "apps/backend/auth_test_results.txt", - "apps/backend/docs/API-REFERENCE.md", - "apps/backend/test_results.txt", - "apps/etl/connectors/feiqiu/docs/CHANGELOG.md", - "apps/etl/connectors/feiqiu/docs/README.md", - "apps/etl/connectors/feiqiu/docs/business-rules/dws_metrics.md", - "apps/etl/connectors/feiqiu/docs/business-rules/scd2_rules.md", - "apps/etl/connectors/feiqiu/docs/etl_tasks/README.md", - "apps/etl/connectors/feiqiu/docs/etl_tasks/base_task_mechanism.md", - "apps/etl/connectors/feiqiu/docs/etl_tasks/dws_tasks.md", - "apps/etl/connectors/feiqiu/docs/etl_tasks/index_tasks.md", - "apps/etl/connectors/feiqiu/docs/etl_tasks/ods_tasks.md", - "apps/etl/connectors/feiqiu/docs/operations/environment_setup.md", - "apps/etl/connectors/feiqiu/docs/operations/troubleshooting.md", - "apps/etl/connectors/feiqiu/orchestration/flow_runner.py", - "apps/etl/connectors/feiqiu/orchestration/task_executor.py", - "apps/etl/connectors/feiqiu/orchestration/task_registry.py", - "apps/etl/connectors/feiqiu/orchestration/topological_sort.py", - "apps/etl/connectors/feiqiu/quality/consistency_checker.py", - "apps/etl/connectors/feiqiu/scripts/verify_dws_extensions.py", - "apps/etl/connectors/feiqiu/tasks/dwd/dwd_load_task.py", - 
"apps/etl/connectors/feiqiu/tasks/dws/__init__.py", - "apps/etl/connectors/feiqiu/tasks/dws/assistant_daily_task.py", - "apps/etl/connectors/feiqiu/tasks/dws/assistant_order_contribution_task.py", - "apps/etl/connectors/feiqiu/tasks/dws/member_consumption_task.py", - "apps/etl/connectors/feiqiu/tasks/dws/member_visit_task.py", - "apps/etl/connectors/feiqiu/tasks/ods/ods_tasks.py", - "apps/etl/connectors/feiqiu/tests/unit/test_topological_sort.py", - "apps/miniprogram/README.md", - "db/README.md", - "db/_archived/ddl_baseline_2026-02-22/db/etl_feiqiu/schemas/dwd.sql", - "db/_archived/ddl_baseline_2026-02-22/db/etl_feiqiu/schemas/ods.sql", - "db/_archived/ddl_baseline_2026-02-22/db/zqyy_app/migrations/2025-02-24__add_fdw_dws_extensions.sql", - "db/etl_feiqiu/migrations/2025-02-24__alter_assistant_daily_add_penalty_fields.sql", - "db/etl_feiqiu/migrations/2025-02-24__alter_member_consumption_add_recharge_fields.sql", - "db/etl_feiqiu/migrations/2025-02-24__create_dws_assistant_order_contribution.sql", - "db/etl_feiqiu/migrations/2025-02-24__create_rls_view_assistant_order_contribution.sql", - "db/etl_feiqiu/migrations/2026-02-24__add_goods_stock_warning_info.sql", - "db/etl_feiqiu/migrations/2026-02-24__cleanup_assistant_abolish_residual.sql", - "db/etl_feiqiu/migrations/2026-02-24__p1_create_app_schema_rls_views.sql", - "db/etl_feiqiu/seeds/seed_ods_tasks.sql", - "db/etl_feiqiu/seeds/seed_scheduler_tasks.sql", - "db/zqyy_app/README.md", - "db/zqyy_app/migrations/2026-02-24__p1_create_auth_biz_schemas.sql", - "db/zqyy_app/migrations/2026-02-24__p1_setup_fdw_etl.sql" - ], - "fingerprint": "96d0946e775eac6698780fe8290e7e73d762b201", - "taken_at": "2026-02-26T08:03:18.159857+08:00" -} \ No newline at end of file diff --git a/.kiro/.last_prompt_id.json b/.kiro/.last_prompt_id.json deleted file mode 100644 index 486abdf..0000000 --- a/.kiro/.last_prompt_id.json +++ /dev/null @@ -1,4 +0,0 @@ -{ - "prompt_id": "P20260226-080318", - "at": "2026-02-26T08:03:18.159857+08:00" -} 
\ No newline at end of file diff --git a/.kiro/agents/audit-writer.md b/.kiro/agents/audit-writer.md index 8937b6a..1e0d0fd 100644 --- a/.kiro/agents/audit-writer.md +++ b/.kiro/agents/audit-writer.md @@ -8,7 +8,7 @@ tools: ["read", "write", "shell"] ## 核心原则:从预构建上下文工作,禁止全盘扫描 -你的唯一输入是 `.kiro/.audit_context.json`(由 `build_audit_context.py` 预构建)。 +你的唯一输入是 `.kiro/state/.audit_context.json`(由 `build_audit_context.py` 预构建)。 该文件已包含所有你需要的信息: | 字段 | 来源 | 内容 | @@ -22,7 +22,9 @@ tools: ["read", "write", "shell"] | `compliance.new_migration_sql` | compliance-prescan | 新增迁移 SQL 列表 | | `compliance.has_bd_manual` | compliance-prescan | 是否已有 BD_Manual 文档 | | `compliance.has_ddl_baseline` | compliance-prescan | 是否已更新 DDL 基线 | -| `external_files` | agent-on-stop | 非 Kiro 操作产生的变更文件(CLI/脚本/手动编辑) | +| `compliance.api_changed` | compliance-prescan | 是否有接口相关文件变更 | +| `compliance.openapi_spec_stale` | compliance-prescan | OpenAPI spec 是否需要重新导出 | +| `session_diff` | agent-on-stop (file baseline) | 本次对话期间的精确变更:`added`/`modified`/`deleted` | | `prompt_id` / `latest_prompt_log` | prompt-audit-log | Prompt-ID 与原文(溯源用) | **禁止操作**: @@ -51,18 +53,60 @@ tools: ["read", "write", "shell"] ## 执行策略(从 context 驱动,不做冗余扫描) ### 步骤 1:读取上下文 -读取 `.kiro/.audit_context.json`,提取关键字段。 +读取 `.kiro/state/.audit_context.json`,提取关键字段。 + +### 步骤 1b:读取 Session 索引 +读取 `docs/audit/session_logs/_session_index.json`,按 `startTime` 找到与 `audit_context.json` 中 `prompt_at` 最接近的 entry(非 `is_sub` 的主对话)。提取: +- `description`:作为审计记录的「操作摘要」(比从 diff 推断更准确、更完整) +- `summary.files_modified` / `summary.files_created`:交叉验证 `session_diff` +- executionId 前 8 位:作为 `session_id` 写入审计记录,建立双向链接 +- `summary.sub_agents`:记录本次对话调用了哪些子代理 +- `summary.errors`:标注执行中的异常 + +若索引不存在或无匹配 entry,跳过此步骤,不影响后续流程。 ### 步骤 2:审计落盘(按需调用 skill) 根据 `reasons` 判断需要哪些 skill: - 含 `dir:backend` / `dir:etl` / `dir:shared` 等 → 调用 `steering-readme-maintainer` - 含任意高风险标签 → 调用 `change-annotation-audit`(写 docs/audit/changes/ + AI_CHANGELOG + CHANGE 注释) -- 含 `db-schema-change` → 调用 
`bd-manual-db-docs` +- 含 `db-schema-change` → 调用 `bd-manual-db-docs`,并执行 DB 文档全量对账(见步骤 2b) -若 `external_files` 非空,在审计记录(`docs/audit/changes/` 文件)中增加「外部变更」段落: -- 列出所有外部变更文件路径 -- 标注来源为"非 Kiro 操作(CLI/脚本/手动编辑)" -- 若外部变更涉及高风险路径,额外标注 ⚠️ +所有审计记录中涉及日期时间的字段,必须精确到秒(格式:`YYYY-MM-DD HH:MM:SS`,时区 Asia/Shanghai)。包括但不限于:审计记录头部的"日期"、AI_CHANGELOG 条目的时间戳、CHANGE 标记注释中的日期。 + +若 `session_diff` 中有 `added` 或 `deleted` 文件,在审计记录中增加「本次对话文件变更」段落,分别列出新增和删除的文件。 + +若步骤 1b 成功获取了 Session 信息,在审计记录头部元数据中增加: +- `session_id`:executionId 前 8 位(如 `f29acdea`) +- `操作摘要`:Session 索引中的 `description`(LLM 生成的操作摘要) +- `session_path`:Session 日志文件的相对路径(`output_dir` 字段值) + +审计记录头部模板: +```markdown +# 变更审计记录:<标题> + +| 字段 | 值 | +|------|-----| +| 日期 | YYYY-MM-DD HH:MM:SS | +| Prompt-ID | <从 audit_context> | +| Session-ID | | +| Session 路径 | | + +## 操作摘要 + +``` + +### 步骤 2b:DB 文档全量对账(当 reasons 含 db-schema-change 时) +当 `reasons` 含 `db-schema-change` 时,除了调用 `bd-manual-db-docs` skill 处理本次变更外,还必须执行全量对账: + +1. 连接测试库(使用 pg power 的 `pg-etl-test` / `pg-app-test`),查询 `information_schema.tables` 和 `information_schema.columns` 获取所有表和字段的实际结构 +2. 扫描 `docs/database/` 下现有文档,逐表对比: + - 文档中缺失的表 → 新建表结构文档 + - 文档中字段与实际不一致(类型、nullable、默认值等)→ 更新文档 + - 文档中存在但数据库已删除的表 → 在文档中标注已废弃 +3. 输出对账摘要到审计记录中,列出:新增文档数、更新文档数、废弃标注数 +4. 
所有文档输出到 `docs/database/`,遵循现有目录结构和模板格式 + +注意:全量对账使用测试库(TEST_DB_DSN),禁止连接正式库。 ### 步骤 3:文档校对补齐 遍历 `compliance.code_without_docs`,对每个缺失项: @@ -71,9 +115,10 @@ tools: ["read", "write", "shell"] | 代码路径前缀 | 应同步更新的文档 | |---|---| -| `apps/backend/app/routers/` | `apps/backend/docs/API-REFERENCE.md` | +| `apps/backend/app/routers/` | `apps/backend/docs/API-REFERENCE.md` + `docs/contracts/openapi/backend-api.json` | | `apps/backend/app/services/` | `apps/backend/docs/API-REFERENCE.md` + `apps/backend/README.md` | -| `apps/backend/app/auth/` | `apps/backend/docs/API-REFERENCE.md` + `apps/backend/README.md` | +| `apps/backend/app/auth/` | `apps/backend/docs/API-REFERENCE.md` + `apps/backend/README.md` + `docs/contracts/openapi/backend-api.json` | +| `apps/backend/app/schemas/` | `docs/contracts/openapi/backend-api.json` | | `apps/etl/connectors/feiqiu/tasks/` | `apps/etl/connectors/feiqiu/docs/etl_tasks/` | | `apps/etl/connectors/feiqiu/loaders/` | `apps/etl/connectors/feiqiu/docs/etl_tasks/` | | `apps/etl/connectors/feiqiu/scd/` | `apps/etl/connectors/feiqiu/docs/business-rules/scd2_rules.md` | @@ -90,8 +135,48 @@ tools: ["read", "write", "shell"] - 若 `compliance.new_migration_sql` 非空且 `compliance.has_ddl_baseline` 为 false: - 在审计记录中标注 ⚠️ DDL 基线待合并 -### 步骤 5:收尾 -- 把 `.kiro/.audit_state.json` 的 `audit_required` 置为 false,清空 `reasons`/`changed_files`/`last_reminded_at` +### 步骤 4b:OpenAPI Spec 同步检查 +- 若 `compliance.api_changed` 为 true 且 `compliance.openapi_spec_stale` 为 true: + - 在审计记录中标注 ⚠️ 接口代码已变更但 OpenAPI spec 未同步 + - 运行 `python scripts/ops/_export_openapi.py` 重新导出 spec(需后端可导入) + - 若导出失败(后端未启动等),在审计记录中标注待手动导出 + - 导出成功后提醒用户重连 OpenAPI Power 的 MCP server 以加载新 spec +- 若 `compliance.api_changed` 为 true 且 `compliance.openapi_spec_stale` 为 false: + - spec 已同步更新,无需额外操作 + +### 步骤 5:改动注解(Change Annotations) + +对本次审计涉及的所有变更文件,在审计记录(`docs/audit/changes/__.md`)中生成逐文件的改动注解段落。 + +注解内容包括: +- 文件路径 +- 变更类型(新增 / 修改 / 删除) +- 原始原因:为什么要做这个改动(从 `latest_prompt_log` 和 diff 上下文推断用户意图) +- 
思路分析:改动的技术思路和设计决策(从 diff 内容和代码结构推断) +- 修改结果:改动后的效果和影响范围 + +格式模板(写入审计记录的 `## 改动注解` 段落): + +```markdown +## 改动注解 + +### `<文件路径>` +- 变更类型:新增 / 修改 / 删除 +- 原始原因:<从 prompt log 和 diff 推断的改动动机> +- 思路分析:<技术思路、设计决策、为什么选择这种实现方式> +- 修改结果:<改动后的效果、影响范围、与其他模块的关联> +``` + +执行规则: +- 只对 `high_risk_files` 和 `session_diff.added` 中的文件写详细注解 +- 对非高风险的 `session_diff.modified` 文件写简要一行注解即可 +- 对 `session_diff.deleted` 文件只记录删除原因 +- 注解内容从 `high_risk_diff`、`latest_prompt_log`、文件内容综合推断,不要编造 +- 若某文件的 diff 被截断,可对该单个文件运行 `git diff HEAD -- ` 获取完整 diff +- 注解语言使用简体中文 + +### 步骤 6:收尾 +- 把 `.kiro/state/.audit_state.json` 的 `audit_required` 置为 false,清空 `reasons`/`changed_files`/`last_reminded_at` - 执行 `python scripts/audit/gen_audit_dashboard.py` 刷新审计一览表 ## 输出(强制极短回执) diff --git a/.kiro/hooks/agent-on-stop.kiro.hook b/.kiro/hooks/agent-on-stop.kiro.hook index 6745a18..eb7435f 100644 --- a/.kiro/hooks/agent-on-stop.kiro.hook +++ b/.kiro/hooks/agent-on-stop.kiro.hook @@ -8,7 +8,8 @@ }, "then": { "type": "runCommand", - "command": "python .kiro/scripts/agent_on_stop.py" + "command": "python C:/NeoZQYY/.kiro/scripts/agent_on_stop.py", + "timeout": 360 }, "workspaceFolderName": "NeoZQYY", "shortName": "agent-on-stop" diff --git a/.kiro/hooks/audit-flagger.kiro.hook b/.kiro/hooks/audit-flagger.kiro.hook deleted file mode 100644 index af37ffc..0000000 --- a/.kiro/hooks/audit-flagger.kiro.hook +++ /dev/null @@ -1,15 +0,0 @@ -{ - "enabled": false, - "name": "Audit Flagger (Prompt Submit)", - "description": "每次提交 prompt 时,基于 git status 判断是否存在高风险改动;若需要审计则写入 .kiro/.audit_state.json(无 stdout)。", - "version": "1", - "when": { - "type": "promptSubmit" - }, - "then": { - "type": "runCommand", - "command": "python .kiro/scripts/audit_flagger.py" - }, - "workspaceFolderName": "NeoZQYY", - "shortName": "audit-flagger" -} \ No newline at end of file diff --git a/.kiro/hooks/audit-reminder.kiro.hook b/.kiro/hooks/audit-reminder.kiro.hook deleted file mode 100644 index 6c60777..0000000 --- a/.kiro/hooks/audit-reminder.kiro.hook 
+++ /dev/null @@ -1,15 +0,0 @@ -{ - "enabled": false, - "name": "Audit Reminder (Agent Stop, 15min)", - "description": "若检测到高风险改动且未审计,则在 agentStop 以 stderr+非0 形式提醒(15 分钟限频;不写 stdout)。", - "version": "1", - "when": { - "type": "agentStop" - }, - "then": { - "type": "runCommand", - "command": "python .kiro/scripts/audit_reminder.py" - }, - "workspaceFolderName": "NeoZQYY", - "shortName": "audit-reminder" -} \ No newline at end of file diff --git a/.kiro/hooks/change-compliance.kiro.hook b/.kiro/hooks/change-compliance.kiro.hook deleted file mode 100644 index 9806441..0000000 --- a/.kiro/hooks/change-compliance.kiro.hook +++ /dev/null @@ -1,15 +0,0 @@ -{ - "enabled": false, - "name": "Change Compliance Check (Agent Stop)", - "description": "对话结束时,审查本次变更的合规性:DB 迁移是否已执行、DDL 是否合并至基线、新增文件是否遵循 doc-map、代码修改是否有对应文档/审计记录。先运行预扫描脚本过滤,无需审查时静默跳过以节省 Token。", - "version": "1", - "when": { - "type": "agentStop" - }, - "then": { - "type": "askAgent", - "prompt": "先运行 `python .kiro/scripts/change_compliance_prescan.py` 获取预扫描结果。\n\n如果输出为 `NO_CHECK_NEEDED`,则回复「✅ 合规检查:无需审查项」,不做任何其他操作。\n\n如果输出为 JSON,则根据以下清单逐项审查并输出简短结论(每项一行,用 ✅/⚠️ 标记):\n\n1. **DB 迁移执行**:检查 `new_migration_sql` 中的 SQL 文件,连接测试库(pg_etl_test / pg_app_test)验证对应表/字段是否已存在。若未执行,标记 ⚠️ 并列出待执行文件。\n2. **DDL 基线合并**:若有迁移 SQL 但 `has_ddl_baseline` 为 false,检查 `docs/database/ddl/` 下对应基线文件是否已更新。\n3. **目录规范**:检查变更文件列表中的新增文件路径是否符合 doc-map 规范(模块专属放模块内、项目级放根目录、审计产物放 docs/audit/)。\n4. 
**文档同步**:检查 `code_without_docs` 列表,列出缺少对应文档更新的代码文件及其应更新的文档路径。\n\n输出格式极简,不超过 15 行。" - }, - "workspaceFolderName": "NeoZQYY", - "shortName": "change-compliance" -} \ No newline at end of file diff --git a/.kiro/hooks/cwd-guard-shell.kiro.hook b/.kiro/hooks/cwd-guard-shell.kiro.hook new file mode 100644 index 0000000..0e0383e --- /dev/null +++ b/.kiro/hooks/cwd-guard-shell.kiro.hook @@ -0,0 +1,16 @@ +{ + "enabled": true, + "name": "CWD Guard for Shell", + "description": "在 AI 执行 shell 命令前,检查是否在运行 Python 脚本。如果是,提醒 AI 确认 cwd 是否正确(仓库根 C:\\NeoZQYY),避免相对路径解析到错误位置。", + "version": "1", + "when": { + "type": "preToolUse", + "toolTypes": [ + "shell" + ] + }, + "then": { + "type": "askAgent", + "prompt": "如果即将执行的命令包含 `python` 且涉及 scripts/ops/、.kiro/scripts/、apps/etl/connectors/feiqiu/scripts/ 下的脚本,请确认:1) cwd 参数是否设置为仓库根目录 C:\\NeoZQYY;2) 脚本是否已有 ensure_repo_root() 校验。如果 cwd 不对且脚本无校验,请修正 cwd 后再执行。对于非 Python 命令或不涉及上述目录的命令,直接放行。" + } +} \ No newline at end of file diff --git a/.kiro/hooks/daily-revenue-report.kiro.hook b/.kiro/hooks/daily-revenue-report.kiro.hook new file mode 100644 index 0000000..131dd75 --- /dev/null +++ b/.kiro/hooks/daily-revenue-report.kiro.hook @@ -0,0 +1,15 @@ +{ + "enabled": true, + "name": "每日经营数据报告", + "description": "手动触发后执行 daily_revenue_report.py,统计 3月1日至当天的每日经营数据(实收、充值、团购结算、到店人次、新会员、充值人数等),输出到 docs/reports/daily-revenue-latest.md", + "version": "1", + "when": { + "type": "userTriggered" + }, + "then": { + "type": "askAgent", + "prompt": "执行 python C:\\NeoZQYY\\scripts\\ops\\daily_revenue_report.py" + }, + "workspaceFolderName": "NeoZQYY", + "shortName": "daily-revenue-report" +} \ No newline at end of file diff --git a/.kiro/hooks/dataflow-analyze.kiro.hook b/.kiro/hooks/dataflow-analyze.kiro.hook deleted file mode 100644 index 9b40617..0000000 --- a/.kiro/hooks/dataflow-analyze.kiro.hook +++ /dev/null @@ -1,15 +0,0 @@ -{ - "enabled": true, - "name": "Data Flow Structure Analysis", - "description": "手动触发数据流结构分析:先执行 Python 脚本采集 API JSON、DB 表结构、三层字段映射和 
BD_manual 业务描述,再由报告生成器输出带锚点链接、业务描述、多示例值、白名单折叠和字段差异报告的 Markdown 文档。", - "version": "4.0.0", - "when": { - "type": "userTriggered" - }, - "then": { - "type": "askAgent", - "prompt": "执行数据流结构分析,按以下步骤完成。若发现已完成或有历史任务痕迹则清空,重新执行:\n\n第一阶段:数据采集\n1. 运行 `python scripts/ops/analyze_dataflow.py` 完成数据采集(如需指定日期范围,加 --date-from / --date-to 参数)\n2. 确认采集结果已落盘,包括:\n - json_trees/(含 samples 多示例值)\n - db_schemas/\n - field_mappings/(三层映射 + 锚点)\n - bd_descriptions/(BD_manual 业务描述)\n - collection_manifest.json(含 json_field_count、date_from、date_to)\n\n第二阶段:报告生成\n3. 运行 `python scripts/ops/gen_dataflow_report.py` 生成 Markdown 报告\n4. 报告包含以下增强内容:\n - 报告头含 API 请求日期范围(date_from ~ date_to)和 JSON 数据总量\n - 总览表含 API JSON 字段数列\n - 1.1 API↔ODS↔DWD 字段对比差异报告(白名单字段折叠汇总,不展开详细表格行)\n - 2.3 覆盖率表含业务描述列\n - API 源字段表含业务描述列 + 多示例值(枚举值解释)\n - ODS 表结构含业务描述列 + 上下游双向映射锚点链接\n - DWD 表结构含业务描述列 + ODS 来源锚点链接\n5. 输出文件路径和关键统计摘要\n\n白名单规则(v4):\n- ETL 元数据列(source_file, source_endpoint, fetched_at, payload, content_hash)\n- DWD 维表 SCD2 管理列(valid_from, valid_to, is_current, etl_loaded_at, etl_batch_id)\n- API siteProfile 嵌套对象字段\n- 白名单字段仍正常参与检查和统计,仅在报告中折叠显示并注明原因\n\n注意:当前仅分析飞球(feiqiu)连接器。未来新增连接器时,应自动发现并纳入分析范围。" - }, - "workspaceFolderName": "NeoZQYY", - "shortName": "dataflow-analyze" -} \ No newline at end of file diff --git a/.kiro/hooks/db-docs-sync.kiro.hook b/.kiro/hooks/db-docs-sync.kiro.hook deleted file mode 100644 index d0d50a0..0000000 --- a/.kiro/hooks/db-docs-sync.kiro.hook +++ /dev/null @@ -1,15 +0,0 @@ -{ - "enabled": true, - "name": "Manual: DB 文档全量同步", - "description": "按需触发:对比 Postgres 实际 schema 与 docs/database/ 下的文档,自动补全或更新缺失/过时的表结构说明,并输出变更摘要。", - "version": "1", - "when": { - "type": "userTriggered" - }, - "then": { - "type": "askAgent", - "prompt": "执行一次按需的数据库文档全量同步。\n\n步骤:\n1) 检查当前 Postgres schema(使用环境中可用的工具/命令,例如 pg_dump --schema-only 或查询 information_schema)。\n2) 与 docs/database 下现有文档进行对比。\n3) 更新缺失或过时的 schema/表结构文档。\n4) 输出对账摘要:哪些文档被修改了、修改原因。输出路径遵循.env路径定义。\n\n注意:如果需要执行 shell 命令,请通过 agent 的 shell 工具调用。" - }, 
- "workspaceFolderName": "NeoZQYY", - "shortName": "db-docs-sync" -} \ No newline at end of file diff --git a/.kiro/hooks/etl-data-consistency.kiro.hook b/.kiro/hooks/etl-data-consistency.kiro.hook deleted file mode 100644 index 4fb1507..0000000 --- a/.kiro/hooks/etl-data-consistency.kiro.hook +++ /dev/null @@ -1,15 +0,0 @@ -{ - "enabled": true, - "name": "ETL Data Consistency Check", - "description": "手动触发 ETL 全链路数据一致性黑盒检查:获取最近一次成功的 ETL 任务,对 API→ODS→DWD→DWS/INDEX 逐表逐字段进行实际数据比对,输出详细的数据差异报告。", - "version": "1.0.0", - "when": { - "type": "userTriggered" - }, - "then": { - "type": "askAgent", - "prompt": "执行 ETL 全链路数据一致性黑盒检查,按以下步骤完成,若发现已完成或有历史任务痕迹则清空,重新执行:\n\n1. 运行 `python scripts/ops/etl_consistency_check.py`\n2. 脚本会自动:\n a. 从 LOG_ROOT 找到最近一次成功的 ETL 日志,解析本次执行的任务列表\n b. 从 FETCH_ROOT 读取本次 ETL 落盘的 API JSON 文件\n c. 连接数据库(PG_DSN),对本次任务涉及的每张表逐字段比对:\n - API JSON vs ODS:字段完整性、值采样比对(随机 5 条记录的关键字段)\n - ODS vs DWD:字段映射正确性、值转换验证(采样比对)\n - DWD vs DWS/INDEX:聚合逻辑验证(行数、关键指标抽查)\n d. 输出 Markdown 报告到 ETL_REPORT_ROOT\n3. 检查报告输出,汇总关键发现\n\n报告结构:\n- 1. ETL 执行概览(任务列表、成功/失败/跳过统计)\n- 2. API↔ODS 数据一致性(逐表逐字段值比对)\n- 3. ODS↔DWD 数据一致性(映射验证 + 值采样)\n- 4. DWD↔DWS 数据一致性(聚合逻辑验证)\n- 5. 
异常汇总与建议\n\n注意:使用正式库 PG_DSN 连接(只读模式),不修改任何数据。" - }, - "workspaceFolderName": "NeoZQYY", - "shortName": "etl-data-consistency" -} \ No newline at end of file diff --git a/.kiro/hooks/etl-fullstack-integration.kiro.hook b/.kiro/hooks/etl-fullstack-integration.kiro.hook new file mode 100644 index 0000000..4088cc0 --- /dev/null +++ b/.kiro/hooks/etl-fullstack-integration.kiro.hook @@ -0,0 +1,15 @@ +{ + "enabled": true, + "name": "ETL FULL TEST", + "description": "一键执行 ETL 全流程前后端联调:启动服务 → Playwright 浏览器提交任务 → 实时监控 → 性能报告 → 黑盒一致性测试 → 服务清理。详细步骤参考 .kiro/specs/[ETL]-fullstack-integration/tasks.md", + "version": "1.1.0", + "when": { + "type": "userTriggered" + }, + "then": { + "type": "askAgent", + "prompt": "执行 ETL 全栈联调运维任务。先读取 `.kiro/specs/[ETL]-fullstack-integration/tasks.md` 获取完整步骤细节,然后严格按以下 6 大步骤依次执行。全程使用 Playwright 浏览器模拟真实用户操作,不直接调用 API。\n\n## 步骤 1:服务启动与健康检查\n- 用 controlPwshProcess 启动后端:uvicorn app.main:app --host 0.0.0.0 --port 8000,cwd=apps/backend/\n- 用 controlPwshProcess 启动前端:pnpm dev,cwd=apps/admin-web/\n- 等待服务就绪,验证 http://localhost:8000/docs 和 http://localhost:5173 可访问\n- Playwright 打开 http://localhost:5173,登录(用户名 admin,密码 admin123)\n- 验证登录成功后跳转到任务配置页,侧边栏菜单正常渲染\n\n## 步骤 2:浏览器操作 - 任务配置与提交\n- 在任务配置页(/)依次操作:\n - Flow 选择 api_full(API → ODS → DWD → DWS → INDEX)\n - 处理模式选择 full_window\n - 时间窗口模式设为【自定义】,开始 2025-7-01,结束为当前时间\n - 窗口切分【按天】,切分天数 30\n - 勾选 force_full(强制全量)\n - 任务选择区域全选 is_common=True 的常用任务(共 41 个)\n- 确认 CLI 命令预览区显示完整参数\n- 点击【直接执行】按钮(SendOutlined 图标),触发 POST /api/execution/run\n- 确认提交成功提示,记录 execution_id\n\n## 步骤 3:执行监控与 DEBUG\n- 导航到【任务管理】页面(/task-manager)\n- 在【队列】Tab 确认任务状态为 running\n- 点击 running 任务行,打开 WebSocket 实时日志流抽屉\n- 按需以 30秒~20分钟 弹性间隔检查页面状态\n- 检测日志中的 ERROR / CRITICAL / Traceback / Exception / WARNING 关键字\n- 连续 20 分钟无新日志输出则报超时警告\n- 任务完成(success/failed/cancelled)时停止监控\n- 收集所有 ERROR 和 WARNING 日志行及上下文,分析错误类型\n- 如果任务失败,切换到【历史】Tab 查看完整执行详情\n\n## 步骤 4:性能计时与报告生成\n- 在【历史】Tab 点击已完成任务查看执行详情\n- 通过 GET /api/execution/{id}/logs 获取完整日志\n- 
从日志提取每个窗口切片(30天)的开始/结束时间,计算耗时\n- 识别 ODS / DWD / DWS / INDEX 各阶段耗时,标注 Top-5 瓶颈\n- 生成综合联调报告到 {SYSTEM_LOG_ROOT}/{date}__etl_integration_report.md\n- 报告包含:执行概要、性能报告(各切片耗时对比、Top-5)、DEBUG 报告\n\n## 步骤 5:黑盒数据一致性测试\n- 运行全链路检查器:uv run python scripts/ops/etl_consistency_check.py(cwd=C:\\\\NeoZQYY)\n - 脚本自动从 LOG_ROOT 找最近 ETL 日志,从 FETCH_ROOT 读 API JSON\n - 连接数据库(PG_DSN)逐表逐字段比对:API vs ODS、ODS vs DWD、DWD vs DWS\n - 白名单:ETL_META_COLS、SCD2_COLS 排除;API 空字符串 vs DB None 视为等价\n - 报告输出到 ETL_REPORT_ROOT\n- 检查 FlowRunner 内置一致性报告(ETL_REPORT_ROOT 下已自动生成)\n- 对比两份报告结论是否一致\n- 将黑盒测试结果摘要追加到步骤 4 的综合报告中(通过/失败统计、白名单差异、失败表清单)\n\n## 步骤 6:服务清理\n- 关闭 Playwright 浏览器实例\n- 停止 uvicorn 后端进程(controlPwshProcess stop)\n- 停止 pnpm dev 前端进程(controlPwshProcess stop)\n- 报告联调完成状态\n\n## 环境与规范要求\n- 环境变量从根 .env 加载(load_dotenv),缺失必须报错,禁止静默回退\n- 数据库使用测试库(PG_DSN 指向 test_etl_feiqiu)\n- 报告路径遵循 export-paths 规范,从环境变量读取\n- 需要的环境变量:PG_DSN、FETCH_ROOT、LOG_ROOT、ETL_REPORT_ROOT、SYSTEM_LOG_ROOT" + }, + "workspaceFolderName": "NeoZQYY", + "shortName": "etl-fullstack-integration" +} \ No newline at end of file diff --git a/.kiro/hooks/etl-unified-analysis.kiro.hook b/.kiro/hooks/etl-unified-analysis.kiro.hook new file mode 100644 index 0000000..c05bb25 --- /dev/null +++ b/.kiro/hooks/etl-unified-analysis.kiro.hook @@ -0,0 +1,15 @@ +{ + "enabled": true, + "name": "ETL Unified Analysis", + "description": "手动触发 ETL 统一分析:合并数据流结构分析和数据一致性检查为一个流程。支持 --mode structure|consistency|full(默认 full),支持 --source api|etl-log(默认 api 主动采集最近 60 天)。", + "version": "1.0.0", + "when": { + "type": "userTriggered" + }, + "then": { + "type": "askAgent", + "prompt": "执行 ETL 统一分析,按以下步骤完成。若发现已完成或有历史任务痕迹则清空,重新执行:\n\n运行 `python scripts/ops/etl_unified_analysis.py`\n\n默认行为(full 模式):\n1. 第一阶段:数据流结构分析\n - 运行 analyze_dataflow.py 采集 API JSON、DB 表结构、三层字段映射、BD_manual 业务描述(默认最近 60 天)\n - 运行 gen_dataflow_report.py 生成结构分析报告\n2. 第二阶段:ETL 数据一致性检查\n - 运行 etl_consistency_check.py 对 API→ODS→DWD→DWS 逐表逐字段比对\n - 每张表展示数据截止日期(create_time/createtime/fetched_at 的 MAX 值)\n3. 
第三阶段:报告合并\n - 将两份报告合并为一份统一报告,输出到 ETL_REPORT_ROOT\n\n可选参数:\n- `--mode structure` 仅执行结构分析\n- `--mode consistency` 仅执行一致性检查\n- `--source etl-log` 切换为读 ETL 落盘 JSON(而非主动调 API)\n- `--date-from YYYY-MM-DD` 指定起始日期\n- `--date-to YYYY-MM-DD` 指定截止日期\n- `--limit N` 每端点最大记录数\n- `--tables t1,t2` 指定分析的表\n\n白名单规则(继承 v5):\n- ETL 元数据列(source_file, source_endpoint, fetched_at, payload, content_hash)\n- DWD 维表 SCD2 管理列(valid_from, valid_to, is_current, etl_loaded_at, etl_batch_id)\n- API siteProfile 嵌套对象字段\n- 时间格式等价:同一时刻的不同格式表示视为内容相同\n- 白名单字段仍正常参与检查和统计,仅在报告中折叠显示并注明原因\n\n注意:\n- 当前仅分析飞球(feiqiu)连接器\n- 数据库使用测试库(TEST_DB_DSN),只读模式" + }, + "workspaceFolderName": "NeoZQYY", + "shortName": "etl-unified-analysis" +} diff --git a/.kiro/hooks/h5-screenshot.kiro.hook b/.kiro/hooks/h5-screenshot.kiro.hook new file mode 100644 index 0000000..9408cf8 --- /dev/null +++ b/.kiro/hooks/h5-screenshot.kiro.hook @@ -0,0 +1,13 @@ +{ + "enabled": true, + "name": "H5 原型截图", + "description": "手动触发:启动 HTTP 服务器 → 运行 screenshot_h5_pages.py 批量截取 docs/h5_ui/pages/ 下所有 H5 原型页面(iPhone 15 Pro Max, 430×932, DPR:3),输出到 docs/h5_ui/screenshots/。完成后关闭服务器。", + "version": "1", + "when": { + "type": "userTriggered" + }, + "then": { + "type": "askAgent", + "prompt": "执行 H5 原型页面批量截图流程:\n1. 启动 HTTP 服务器:`python -m http.server 8765 --directory docs/h5_ui/pages`(用 controlPwshProcess 后台启动,cwd 为 C:\\NeoZQYY)\n2. 等待 2 秒确认服务器就绪\n3. 运行截图脚本:`python C:\\NeoZQYY\\scripts\\ops\\screenshot_h5_pages.py`(cwd 为 C:\\NeoZQYY,timeout 180s)\n4. 检查输出:列出 docs/h5_ui/screenshots/*.png 的文件名和大小,确认数量和关键交互态截图大小合理\n5. 停止 HTTP 服务器(controlPwshProcess stop)\n6. 
简要汇报结果:总截图数、像素尺寸验证(应为 1290×N)、异常文件(如有)" + } +} \ No newline at end of file diff --git a/.kiro/hooks/pre-change-guard.kiro.hook b/.kiro/hooks/pre-change-guard.kiro.hook new file mode 100644 index 0000000..21606bc --- /dev/null +++ b/.kiro/hooks/pre-change-guard.kiro.hook @@ -0,0 +1,16 @@ +{ + "enabled": false, + "name": "Pre-Change Research Guard", + "description": "在写操作执行前检查:是否已完成逻辑改动前置调研(审计历史、文档阅读、上下文摘要)。若未完成则阻止写入,先完成调研流程。", + "version": "1", + "when": { + "type": "preToolUse", + "toolTypes": [ + "write" + ] + }, + "then": { + "type": "askAgent", + "prompt": "你即将执行写操作。请确认:\n\n1. 本次写操作是否涉及逻辑改动(ETL/业务规则/API/数据模型/前端交互)?\n2. 如果涉及逻辑改动,你是否已通过 context-gatherer 子代理完成前置调研,并向用户输出了上下文摘要且获得确认?\n\n若属于例外情况(纯格式/注释/文档纯文字/配置文件/.kiro 目录/用户明确跳过/新建不涉及已有逻辑),可直接继续。\n若未完成前置调研,必须先停止写操作,使用 context-gatherer 子代理完成调研流程后再继续。" + } +} \ No newline at end of file diff --git a/.kiro/hooks/prompt-audit-log.kiro.hook b/.kiro/hooks/prompt-audit-log.kiro.hook deleted file mode 100644 index 018c21a..0000000 --- a/.kiro/hooks/prompt-audit-log.kiro.hook +++ /dev/null @@ -1,15 +0,0 @@ -{ - "enabled": false, - "name": "Prompt Audit Log (Shell)", - "description": "每次提交 prompt 时,用本地 Shell 在 docs/audit/prompt_logs/ 生成独立日志文件(按时间戳命名);不触发 LLM,避免上下文膨胀。", - "version": "3", - "when": { - "type": "promptSubmit" - }, - "then": { - "type": "runCommand", - "command": "python .kiro/scripts/prompt_audit_log.py" - }, - "workspaceFolderName": "NeoZQYY", - "shortName": "prompt-audit-log" -} diff --git a/.kiro/hooks/prompt-on-submit.kiro.hook b/.kiro/hooks/prompt-on-submit.kiro.hook index 06ebd04..628c4ae 100644 --- a/.kiro/hooks/prompt-on-submit.kiro.hook +++ b/.kiro/hooks/prompt-on-submit.kiro.hook @@ -8,7 +8,7 @@ }, "then": { "type": "runCommand", - "command": "python .kiro/scripts/prompt_on_submit.py" + "command": "python C:/NeoZQYY/.kiro/scripts/prompt_on_submit.py" }, "workspaceFolderName": "NeoZQYY", "shortName": "prompt-on-submit" diff --git a/.kiro/hooks/run-audit-writer.kiro.hook 
b/.kiro/hooks/run-audit-writer.kiro.hook index 1dddbd4..923ebd5 100644 --- a/.kiro/hooks/run-audit-writer.kiro.hook +++ b/.kiro/hooks/run-audit-writer.kiro.hook @@ -1,14 +1,14 @@ { "enabled": true, "name": "Manual: Run /audit (via audit-writer subagent)", - "description": "按需触发:读取 agent-on-stop 预构建的审计上下文,启动 audit-writer 子代理执行审计落盘+文档校对。上下文过期时自动重建。", - "version": "5", + "description": "按需触发:读取 agent-on-stop 预构建的审计上下文 + Session 索引,启动 audit-writer 子代理执行审计落盘+文档校对+DB文档全量对账+Session关联。上下文过期时自动重建。", + "version": "11", "when": { "type": "userTriggered" }, "then": { "type": "askAgent", - "prompt": "执行 /audit 审计流程:\n\n**前置检查**:读取 `.kiro/.audit_context.json`,检查 `built_at` 时间戳。若文件不存在或 `built_at` 超过 30 分钟,先运行 `python .kiro/scripts/agent_on_stop.py` 重建上下文,再重新读取。\n\n**主流程**:启动名为 audit-writer 的子代理,传入以下指令:\n\n> 读取 `.kiro/.audit_context.json` 作为唯一输入,不要自行运行 git status/diff/扫描文件。该文件已包含:变更文件列表、高风险文件 diff、合规检查清单(文档缺失/迁移状态/DDL 基线)、外部变更文件列表(external_files)、Prompt-ID 溯源。按 audit-writer.md 中定义的执行策略完成审计落盘+文档校对补齐。\n\n约束:\n- 子代理禁止重复运行 git status --porcelain 或 git diff 全量扫描,所有信息已在 .audit_context.json 中预备好。\n- 子代理需要读取具体文件内容时(如更新文档),可以直接读取对应文件,但不要做全仓库遍历。\n- 子代理必须按需调用 skill:steering-readme-maintainer、change-annotation-audit、bd-manual-db-docs(仅在满足触发条件时)。\n- 子代理必须根据 compliance.code_without_docs 自动补齐缺失的文档同步。\n- 若 external_files 非空,在审计记录中增加「外部变更」段落,列出这些文件并标注来源为非 Kiro 操作。\n- 所有审计产物统一写入 docs/audit/,不写入子模块内部。\n- 完成后把 .kiro/.audit_state.json 中 audit_required 置为 false。\n- 执行 `python scripts/audit/gen_audit_dashboard.py` 刷新审计一览表。\n- 最终回复必须是极短回执:done/files_written/next_step。" + "prompt": "执行 /audit 审计流程:\n\n**第零步:获取当前时间**:运行 `python -c \"from datetime import datetime, timezone, timedelta; print(datetime.now(timezone(timedelta(hours=8))).isoformat())\"` 获取当前北京时间,记为 `now`。后续所有「超过 30 分钟」的判断以此 `now` 为基准。\n\n**前置检查**:读取 `.kiro/state/.audit_context.json`,检查 `built_at` 时间戳。若文件不存在或 `built_at` 距 `now` 超过 30 分钟,先运行 `python .kiro/scripts/agent_on_stop.py --force-rebuild` 重建上下文,再重新读取。\n\n**Session 索引读取**:读取 
`docs/audit/session_logs/_session_index.json`,找到与本次对话时间最接近的 entry(按 `startTime` 匹配),提取其 `description`(LLM 操作摘要)和 `summary`(结构化摘要)。这些信息将用于:\n- 作为审计记录头部的「操作摘要」来源(比从 diff 推断更准确)\n- 交叉验证 audit_context.json 中的 session_diff(files_modified/created)\n- 记录本次审计关联的 session executionId,建立双向链接\n\n**主流程**:启动名为 audit-writer 的子代理,传入以下指令:\n\n> 读取 `.kiro/state/.audit_context.json` 作为主输入,同时参考 Session 索引中匹配的 entry。不要自行运行 git status/diff/扫描文件。audit_context.json 已包含:变更文件列表、高风险文件 diff、合规检查清单(文档缺失/迁移状态/DDL 基线/接口变更/OpenAPI spec 状态)、本次对话精确变更(session_diff: added/modified/deleted)、Prompt-ID 溯源。按 audit-writer.md 中定义的执行策略完成审计落盘+文档校对补齐。\n\n约束:\n- 子代理禁止重复运行 git status --porcelain 或 git diff 全量扫描,所有信息已在 .audit_context.json 中预备好。\n- 子代理需要读取具体文件内容时(如更新文档),可以直接读取对应文件,但不要做全仓库遍历。\n- 子代理必须按需调用 skill:steering-readme-maintainer、change-annotation-audit、bd-manual-db-docs(仅在满足触发条件时)。\n- 子代理必须根据 compliance.code_without_docs 自动补齐缺失的文档同步。\n- 当 reasons 含 db-schema-change 时,子代理必须执行 DB 文档全量对账:连接测试库(TEST_DB_DSN)查询 information_schema,与 docs/database/ 下现有文档全量对比,补全或更新所有缺失/过时的表结构说明(不仅限于本次变更涉及的表),输出对账摘要。\n- 子代理应参考 session_diff 中的 added/modified/deleted 列表,精确定位本次对话的变更范围。\n- **Session 关联**:在审计记录(docs/audit/changes/*.md)头部增加 `session_id` 字段(executionId 前 8 位),并将 Session 索引中的 description 作为「操作摘要」写入审计记录。这建立了审计记录 ↔ Session 日志的双向链接。\n- 子代理必须为所有变更文件生成改动注解(步骤 5),写入审计记录的「改动注解」段落,包含:变更类型、原始原因、思路分析、修改结果。高风险文件写详细注解,普通修改写简要一行,删除文件只记录原因。\n- 若 compliance.api_changed=true 且 compliance.openapi_spec_stale=true,运行 `python scripts/ops/_export_openapi.py` 重新导出 OpenAPI spec;导出失败则在审计记录标注待手动导出;导出成功则提醒用户重连 OpenAPI Power MCP server。\n- 所有审计产物统一写入 docs/audit/,不写入子模块内部。\n- 完成后把 .kiro/state/.audit_state.json 中 audit_required 置为 false。\n- 执行 `python scripts/audit/gen_audit_dashboard.py` 刷新审计一览表。\n- **文档地图更新**:审计完成后,自动更新 `docs/DOCUMENTATION-MAP.md`:\n - 检查本次审计涉及的文档变更(从审计记录中识别)\n - 扫描 `docs/` 目录和各模块内部文档的变化(新增、修改、删除)\n - 特别关注数据库文档(`docs/database/`)是否有新增的 BD_Manual 文件\n - 根据发现的文档变更,更新文档地图中的相应条目\n - 确保文档地图的结构完整,所有重要文档都有记录\n- 
最终回复必须是极短回执:done/files_written/next_step。" }, "workspaceFolderName": "NeoZQYY", "shortName": "audit" diff --git a/.kiro/hooks/session-log.kiro.hook b/.kiro/hooks/session-log.kiro.hook deleted file mode 100644 index 0a873d8..0000000 --- a/.kiro/hooks/session-log.kiro.hook +++ /dev/null @@ -1,15 +0,0 @@ -{ - "enabled": false, - "name": "Session Log (Agent Stop)", - "description": "每次对话结束时,记录本次对话的完整日志(用户输入、agent 输出、变更文件、git diff stat)到 docs/audit/session_logs/。纯 Shell 执行,不触发 LLM。", - "version": "1", - "when": { - "type": "agentStop" - }, - "then": { - "type": "runCommand", - "command": "python .kiro/scripts/session_log.py" - }, - "workspaceFolderName": "NeoZQYY", - "shortName": "session-log" -} \ No newline at end of file diff --git a/.kiro/hooks/session-summary.kiro.hook b/.kiro/hooks/session-summary.kiro.hook new file mode 100644 index 0000000..f723413 --- /dev/null +++ b/.kiro/hooks/session-summary.kiro.hook @@ -0,0 +1,15 @@ +{ + "enabled": true, + "name": "Session description maker", + "description": "手动触发:为缺少 description 的 session log 调用百炼千问 API 生成摘要,写入双索引。askAgent 模式可看到实时输出。", + "version": "1", + "when": { + "type": "userTriggered" + }, + "then": { + "type": "askAgent", + "prompt": "请在后台运行以下命令并展示实时输出:python -B C:/NeoZQYY/scripts/ops/batch_generate_summaries.py" + }, + "workspaceFolderName": "NeoZQYY", + "shortName": "session-summary" +} \ No newline at end of file diff --git a/.kiro/scripts/_ensure_root.py b/.kiro/scripts/_ensure_root.py new file mode 100644 index 0000000..ad6bf45 --- /dev/null +++ b/.kiro/scripts/_ensure_root.py @@ -0,0 +1,39 @@ +# -*- coding: utf-8 -*- +"""cwd 校验工具 — .kiro/scripts/ 下所有脚本共享。 + +用法: + from _ensure_root import ensure_repo_root + ensure_repo_root() + +委托给 neozqyy_shared.repo_root(共享包),未安装时 fallback。 +""" +from __future__ import annotations + +import os +import warnings +from pathlib import Path + + +def ensure_repo_root() -> Path: + """校验 cwd 是否为仓库根目录,不是则自动切换。""" + try: + from neozqyy_shared.repo_root import ensure_repo_root as 
_shared + return _shared() + except ImportError: + pass + # fallback + cwd = Path.cwd() + if (cwd / "pyproject.toml").is_file() and (cwd / ".kiro").is_dir(): + return cwd + root = Path(__file__).resolve().parents[2] + if (root / "pyproject.toml").is_file() and (root / ".kiro").is_dir(): + os.chdir(root) + warnings.warn( + f"cwd 不是仓库根目录,已自动切换: {cwd} → {root}", + stacklevel=2, + ) + return root + raise RuntimeError( + f"无法定位仓库根目录。当前 cwd={cwd},推断 root={root}。" + f"请在仓库根目录下运行脚本。" + ) diff --git a/.kiro/scripts/agent_on_stop.py b/.kiro/scripts/agent_on_stop.py index 9607323..e3ecc21 100644 --- a/.kiro/scripts/agent_on_stop.py +++ b/.kiro/scripts/agent_on_stop.py @@ -1,14 +1,16 @@ #!/usr/bin/env python3 -"""agent_on_stop — agentStop 合并 hook 脚本。 +"""agent_on_stop — agentStop 合并 hook 脚本(v3:含 LLM 摘要生成)。 -合并原 audit_reminder + session_log + change_compliance_prescan + build_audit_context: -1. 检测变更(对比 promptSubmit 快照,识别非 Kiro 变更) -2. 若无任何文件变更 → 跳过所有审查,静默退出 -3. 记录 session log → docs/audit/session_logs/ -4. 合规预扫描 → .kiro/.compliance_state.json -5. 构建审计上下文 → .kiro/.audit_context.json -6. 审计提醒(15 分钟限频)→ stderr +合并原 audit_reminder + change_compliance_prescan + build_audit_context + session_extract: +1. 全量会话记录提取 → docs/audit/session_logs/(无论是否有代码变更) +2. 为刚提取的 session 调用百炼 API 生成 description → 写入双索引 +3. 扫描工作区 → 与 promptSubmit 基线对比 → 精确检测本次对话变更 +4. 若无任何文件变更 → 跳过审查,静默退出 +5. 合规预扫描 → .kiro/state/.compliance_state.json +6. 构建审计上下文 → .kiro/state/.audit_context.json +7. 
审计提醒(15 分钟限频)→ stderr +变更检测基于文件 mtime+size 基线对比,不依赖 git commit 历史。 所有功能块用 try/except 隔离,单个失败不影响其他。 """ @@ -20,18 +22,20 @@ import subprocess import sys from datetime import datetime, timezone, timedelta +# 同目录导入文件基线模块 + cwd 校验 +sys.path.insert(0, os.path.dirname(os.path.abspath(__file__))) +from file_baseline import scan_workspace, load_baseline, diff_baselines, total_changes +from _ensure_root import ensure_repo_root + TZ_TAIPEI = timezone(timedelta(hours=8)) MIN_INTERVAL = timedelta(minutes=15) # 路径常量 -STATE_PATH = os.path.join(".kiro", ".audit_state.json") -SNAPSHOT_PATH = os.path.join(".kiro", ".git_snapshot.json") -COMPLIANCE_PATH = os.path.join(".kiro", ".compliance_state.json") -CONTEXT_PATH = os.path.join(".kiro", ".audit_context.json") -PROMPT_ID_PATH = os.path.join(".kiro", ".last_prompt_id.json") -SESSION_LOG_DIR = os.path.join("docs", "audit", "session_logs") - -# 噪声路径 +STATE_PATH = os.path.join(".kiro", "state", ".audit_state.json") +COMPLIANCE_PATH = os.path.join(".kiro", "state", ".compliance_state.json") +CONTEXT_PATH = os.path.join(".kiro", "state", ".audit_context.json") +PROMPT_ID_PATH = os.path.join(".kiro", "state", ".last_prompt_id.json") +# 噪声路径(用于过滤变更列表中的非业务文件) NOISE_PATTERNS = [ re.compile(r"^docs/audit/"), re.compile(r"^\.kiro/"), @@ -53,9 +57,11 @@ HIGH_RISK_PATTERNS = [ # 文档映射(合规检查用) DOC_MAP = { - "apps/backend/app/routers/": ["apps/backend/docs/API-REFERENCE.md"], + "apps/backend/app/routers/": ["apps/backend/docs/API-REFERENCE.md", "docs/contracts/openapi/backend-api.json"], "apps/backend/app/services/": ["apps/backend/docs/API-REFERENCE.md", "apps/backend/README.md"], - "apps/backend/app/auth/": ["apps/backend/docs/API-REFERENCE.md", "apps/backend/README.md"], + "apps/backend/app/auth/": ["apps/backend/docs/API-REFERENCE.md", "apps/backend/README.md", "docs/contracts/openapi/backend-api.json"], + "apps/backend/app/schemas/": ["docs/contracts/openapi/backend-api.json"], + "apps/backend/app/main.py": 
["docs/contracts/openapi/backend-api.json"], "apps/etl/connectors/feiqiu/tasks/": ["apps/etl/connectors/feiqiu/docs/etl_tasks/"], "apps/etl/connectors/feiqiu/loaders/": ["apps/etl/connectors/feiqiu/docs/etl_tasks/"], "apps/etl/connectors/feiqiu/scd/": ["apps/etl/connectors/feiqiu/docs/business-rules/scd2_rules.md"], @@ -65,6 +71,14 @@ DOC_MAP = { "packages/shared/": ["packages/shared/README.md"], } +# 接口变更检测模式(routers / auth / schemas / main.py) +API_CHANGE_PATTERNS = [ + re.compile(r"^apps/backend/app/routers/"), + re.compile(r"^apps/backend/app/auth/"), + re.compile(r"^apps/backend/app/schemas/"), + re.compile(r"^apps/backend/app/main\.py$"), +] + MIGRATION_PATTERNS = [ re.compile(r"^db/etl_feiqiu/migrations/.*\.sql$"), re.compile(r"^db/zqyy_app/migrations/.*\.sql$"), @@ -99,34 +113,11 @@ def safe_read_json(path): def write_json(path, data): - os.makedirs(os.path.dirname(path) or ".kiro", exist_ok=True) + os.makedirs(os.path.dirname(path) or os.path.join(".kiro", "state"), exist_ok=True) with open(path, "w", encoding="utf-8") as f: json.dump(data, f, indent=2, ensure_ascii=False) -def get_changed_files() -> list[str]: - try: - r = subprocess.run( - ["git", "status", "--porcelain"], - capture_output=True, text=True, encoding="utf-8", errors="replace", timeout=10 - ) - if r.returncode != 0: - return [] - except Exception: - return [] - files = [] - for line in r.stdout.splitlines(): - if len(line) < 4: - continue - path = line[3:].strip() - if " -> " in path: - path = path.split(" -> ")[-1] - path = path.strip().strip('"').replace("\\", "/") - if path: - files.append(path) - return sorted(set(files)) - - def git_diff_stat(): try: r = subprocess.run( @@ -138,7 +129,8 @@ def git_diff_stat(): return "" -def git_diff_files(files, max_total=30000): +def git_diff_files(files, max_total=30000, max_per_file=15000): + """获取文件的实际 diff 内容。对已跟踪文件用 git diff HEAD,对新文件直接读取内容。""" if not files: return "" all_diff = [] @@ -148,14 +140,26 @@ def git_diff_files(files, 
max_total=30000): all_diff.append(f"\n[TRUNCATED: diff exceeds {max_total // 1000}KB]") break try: + # 先尝试 git diff HEAD r = subprocess.run( ["git", "diff", "HEAD", "--", f], capture_output=True, text=True, encoding="utf-8", errors="replace", timeout=10 ) + chunk = "" if r.returncode == 0 and r.stdout.strip(): chunk = r.stdout.strip() - if len(chunk) > 5000: - chunk = chunk[:5000] + f"\n[TRUNCATED: {f} diff too long]" + elif os.path.isfile(f): + # untracked 新文件:直接读取内容作为 diff + try: + with open(f, "r", encoding="utf-8", errors="replace") as fh: + file_content = fh.read(max_per_file + 100) + chunk = f"--- /dev/null\n+++ b/{f}\n@@ -0,0 +1 @@\n" + file_content + except Exception: + continue + + if chunk: + if len(chunk) > max_per_file: + chunk = chunk[:max_per_file] + f"\n[TRUNCATED: {f} diff too long]" all_diff.append(chunk) total_len += len(chunk) except Exception: @@ -181,108 +185,49 @@ def get_latest_prompt_log(): return "" -# ── 步骤 1:检测变更,识别非 Kiro 变更 ── -def detect_changes(current_files): - """对比 promptSubmit 快照,返回 (real_files, external_files, no_change)""" - snapshot = safe_read_json(SNAPSHOT_PATH) - snapshot_files = set(snapshot.get("files", [])) - current_set = set(current_files) +# ── 步骤 1:基于文件基线检测变更 ── +def detect_changes_via_baseline(): + """扫描当前工作区,与 promptSubmit 基线对比,返回精确的变更列表。 - # 排除噪声后的真实变更 - real_files = sorted(f for f in current_files if not is_noise(f)) + 返回 (all_changed_files, external_files, diff_result, no_change) + - all_changed_files: 本次对话期间所有变更文件(added + modified) + - external_files: 暂时等于 all_changed_files(后续可通过 Kiro 写入日志细化) + - diff_result: 完整的 diff 结果 {added, modified, deleted} + - no_change: 是否无任何变更 + """ + before = load_baseline() + after = scan_workspace(".") + + if not before: + # 没有基线(首次运行或基线丢失),无法对比,回退到全部文件 + return [], [], {"added": [], "modified": [], "deleted": []}, True + + diff = diff_baselines(before, after) + count = total_changes(diff) + + if count == 0: + return [], [], diff, True + + # 所有变更文件 = added + modified(deleted 
的文件已不存在,不参与风险判定) + all_changed = sorted(set(diff["added"] + diff["modified"])) + + # 过滤噪声 + real_files = [f for f in all_changed if not is_noise(f)] if not real_files: - return [], [], True + return [], [], diff, True - # 检测非 Kiro 变更:在 agentStop 时出现但 promptSubmit 快照中没有的文件 - # 这些是对话期间由外部操作(CLI、脚本等)产生的变更 - new_since_submit = current_set - snapshot_files - external_files = sorted(f for f in new_since_submit if not is_noise(f)) + # 外部变更:目前所有基线检测到的变更都记录, + # 因为 Kiro 的写入也会改变 mtime,所以这里的"外部"含义是 + # "本次对话期间发生的所有变更",包括 Kiro 和非 Kiro 的。 + # 精确区分需要 Kiro 运行时提供写入文件列表,目前不可用。 + external_files = [] # 不再误报外部变更 - return real_files, external_files, False + return real_files, external_files, diff, False -# ── 步骤 2:Session Log ── -def do_session_log(now, changed_files, external_files): - agent_output = os.environ.get("AGENT_OUTPUT", "") - user_prompt = os.environ.get("USER_PROMPT", "") - prompt_info = safe_read_json(PROMPT_ID_PATH) - audit_state = safe_read_json(STATE_PATH) - prompt_id = prompt_info.get("prompt_id", "unknown") - max_len = 50000 - if len(agent_output) > max_len: - agent_output = agent_output[:max_len] + "\n\n[TRUNCATED: output exceeds 50KB]" - if len(user_prompt) > 10000: - user_prompt = user_prompt[:10000] + "\n\n[TRUNCATED: prompt exceeds 10KB]" - diff_stat = git_diff_stat() - git_status = "" - try: - r = subprocess.run( - ["git", "status", "--short"], - capture_output=True, text=True, encoding="utf-8", errors="replace", timeout=10 - ) - git_status = r.stdout.strip() if r.returncode == 0 else "" - except Exception: - pass - - os.makedirs(SESSION_LOG_DIR, exist_ok=True) - filename = f"session_{now.strftime('%Y%m%d_%H%M%S')}.md" - - # 外部变更标记 - external_section = "" - if external_files: - ext_list = "\n".join(external_files[:30]) - external_section = f""" -## External Changes (non-Kiro, {len(external_files)} files) - -以下文件在本次对话期间由外部操作(CLI/脚本/手动编辑)产生: - -``` -{ext_list} -``` -""" - - content = f"""# Session Log — {now.strftime('%Y-%m-%d %H:%M:%S %z')} - -- Prompt-ID: 
`{prompt_id}` -- Audit Required: `{audit_state.get('audit_required', 'N/A')}` -- Reasons: {', '.join(audit_state.get('reasons', [])) or 'none'} -- External Changes: {len(external_files)} files - -## User Input - -```text -{user_prompt or '(not captured)'} -``` - -## Agent Output - -```text -{agent_output or '(not captured)'} -``` - -## Changed Files ({len(changed_files)}) - -``` -{chr(10).join(changed_files[:80]) if changed_files else '(none)'} -``` -{external_section} -## Git Diff Stat - -``` -{diff_stat} -``` - -## Git Status - -``` -{git_status or '(clean)'} -``` -""" - with open(os.path.join(SESSION_LOG_DIR, filename), "w", encoding="utf-8") as f: - f.write(content) # ── 步骤 3:合规预扫描 ── @@ -295,6 +240,8 @@ def do_compliance_prescan(all_files): "has_bd_manual": False, "has_audit_record": False, "has_ddl_baseline": False, + "api_changed": False, + "openapi_spec_stale": False, } code_files = [] @@ -319,6 +266,15 @@ def do_compliance_prescan(all_files): doc_files.add(f) if f.endswith((".py", ".ts", ".tsx", ".js", ".jsx")): code_files.append(f) + # 检测接口相关文件变更 + for ap in API_CHANGE_PATTERNS: + if ap.search(f): + result["api_changed"] = True + break + + # 接口变更但 openapi spec 未同步更新 → 标记过期 + if result["api_changed"] and "docs/contracts/openapi/backend-api.json" not in all_files: + result["openapi_spec_stale"] = True for cf in code_files: expected_docs = [] @@ -343,6 +299,7 @@ def do_compliance_prescan(all_files): needs_check = bool( result["new_migration_sql"] or result["code_without_docs"] + or result["openapi_spec_stale"] ) now = now_taipei() @@ -355,14 +312,21 @@ def do_compliance_prescan(all_files): # ── 步骤 4:构建审计上下文 ── -def do_build_audit_context(all_files, external_files, compliance): +def do_build_audit_context(all_files, diff_result, compliance): now = now_taipei() audit_state = safe_read_json(STATE_PATH) prompt_info = safe_read_json(PROMPT_ID_PATH) - changed_files = audit_state.get("changed_files", all_files[:50]) + # 使用 audit_state 中的 changed_files(来自 git status 
的风险文件) + # 与本次对话的 baseline diff 合并 + git_changed = audit_state.get("changed_files", []) + session_changed = all_files # 本次对话期间变更的文件 + + # 合并两个来源,去重 + all_changed = sorted(set(git_changed + session_changed)) + high_risk_files = [ - f for f in changed_files + f for f in all_changed if any(p.search(f) for p in HIGH_RISK_PATTERNS) ] @@ -377,15 +341,21 @@ def do_build_audit_context(all_files, external_files, compliance): "audit_required": audit_state.get("audit_required", False), "db_docs_required": audit_state.get("db_docs_required", False), "reasons": audit_state.get("reasons", []), - "changed_files": changed_files, + "changed_files": all_changed[:100], "high_risk_files": high_risk_files, - "external_files": external_files, + "session_diff": { + "added": diff_result.get("added", [])[:50], + "modified": diff_result.get("modified", [])[:50], + "deleted": diff_result.get("deleted", [])[:50], + }, "compliance": { "code_without_docs": compliance.get("code_without_docs", []), "new_migration_sql": compliance.get("new_migration_sql", []), "has_bd_manual": compliance.get("has_bd_manual", False), "has_audit_record": compliance.get("has_audit_record", False), "has_ddl_baseline": compliance.get("has_ddl_baseline", False), + "api_changed": compliance.get("api_changed", False), + "openapi_spec_stale": compliance.get("openapi_spec_stale", False), }, "diff_stat": diff_stat, "high_risk_diff": high_risk_diff, @@ -401,13 +371,8 @@ def do_audit_reminder(real_files): if not state.get("audit_required"): return - # 工作树干净时清除 + # 无变更时不提醒 if not real_files: - state["audit_required"] = False - state["reasons"] = [] - state["changed_files"] = [] - state["last_reminded_at"] = None - write_json(STATE_PATH, state) return now = now_taipei() @@ -425,66 +390,253 @@ def do_audit_reminder(real_files): reasons = state.get("reasons", []) reason_text = ", ".join(reasons) if reasons else "high-risk paths changed" - ext_note = "" - # 从 context 读取外部变更数量 - ctx = safe_read_json(CONTEXT_PATH) - ext_count = 
len(ctx.get("external_files", [])) - if ext_count: - ext_note = f" (includes {ext_count} external/non-Kiro changes)" + # 仅信息性提醒,exit(0) 避免 agent 将其视为错误并自行执行审计 + # 审计留痕统一由用户手动触发 /audit 完成 sys.stderr.write( - f"[AUDIT REMINDER] Pending audit ({reason_text}){ext_note}. " + f"[AUDIT REMINDER] Pending audit ({reason_text}), " + f"{len(real_files)} files changed this session. " f"Run /audit to sync. (15min rate limit)\n" ) - sys.exit(1) + sys.exit(0) + + +# ── 步骤 6:全量会话记录提取 ── +def do_full_session_extract(): + """从 Kiro globalStorage 提取当前 execution 的全量对话记录。 + 调用 scripts/ops/extract_kiro_session.py 的核心逻辑。 + 仅提取最新一条未索引的 execution,避免重复。 + """ + # 动态导入提取器(避免启动时 import 开销) + scripts_ops = os.path.join(os.path.dirname(os.path.abspath(__file__)), "..", "..", "scripts", "ops") + scripts_ops = os.path.normpath(scripts_ops) + if scripts_ops not in sys.path: + sys.path.insert(0, scripts_ops) + + try: + from extract_kiro_session import extract_latest + except ImportError: + return # 提取器不存在则静默跳过 + + # globalStorage 路径:从环境变量或默认位置 + global_storage = os.environ.get( + "KIRO_GLOBAL_STORAGE", + os.path.join(os.environ.get("APPDATA", ""), "Kiro", "User", "globalStorage") + ) + workspace_path = os.getcwd() + + extract_latest(global_storage, workspace_path) + + +def _extract_summary_content(md_content: str) -> str: + """从 session log markdown 中提取适合生成摘要的内容。 + + 策略:如果"用户输入"包含 CONTEXT TRANSFER(跨轮续接), + 则替换为简短标注,避免历史背景干扰本轮摘要生成。 + """ + import re + # 检测用户输入中是否包含 context transfer + ct_pattern = re.compile(r"## 2\. 用户输入\s*\n```\s*\n.*?CONTEXT TRANSFER", re.DOTALL) + if ct_pattern.search(md_content): + # 替换"用户输入"section 为简短标注 + # 匹配从 "## 2. 用户输入" 到下一个 "## 3." 之间的内容 + md_content = re.sub( + r"(## 2\. 
用户输入)\s*\n```[\s\S]*?```\s*\n(?=## 3\.)", + r"\1\n\n[本轮为 Context Transfer 续接,用户输入为历史多轮摘要,已省略。请基于执行摘要和对话记录中的实际工具调用判断本轮工作。]\n\n", + md_content, + ) + return md_content + + +# ── 步骤 7:为最新 session 生成 LLM 摘要 ── +_SUMMARY_SYSTEM_PROMPT = """你是一个专业的技术对话分析师。你的任务是为 AI 编程助手的一轮执行(execution)生成简洁的中文摘要。 + +背景:一个对话(chatSession)包含多轮执行(execution)。每轮执行 = 用户发一条消息 → AI 完成响应。你收到的是单轮执行的完整记录。 + +摘要规则: +1. 只描述本轮执行实际完成的工作,不要描述历史背景 +2. 列出完成的功能点/任务(一轮可能完成多个) +3. 包含关键技术细节:文件路径、模块名、数据库表、API 端点等 +4. bug 修复要说明原因和方案 +5. 不写过程性描述("用户说..."),只写结果 +6. 内容太短或无实质内容的,写"无实质内容" +7. 不限字数,信息完整优先,避免截断失真 + +重要: +- "执行摘要"(📋)是最可靠的信息源,优先基于它判断本轮做了什么 +- 如果"用户输入"包含 CONTEXT TRANSFER,那是之前多轮的历史摘要,不是本轮工作 +- 对话记录中的实际工具调用和文件变更才是本轮的真实操作 + +请直接输出摘要,不要添加任何前缀或解释。""" + + +def do_generate_description(): + """为缺少 description 的主对话 entry 调用百炼 API 生成摘要,写入双索引。""" + from dotenv import load_dotenv + load_dotenv() + + api_key = os.environ.get("BAILIAN_API_KEY", "") + if not api_key: + return + + model = os.environ.get("BAILIAN_MODEL", "qwen-plus") + base_url = os.environ.get("BAILIAN_BASE_URL", "https://dashscope.aliyuncs.com/compatible-mode/v1") + + scripts_ops = os.path.join(os.path.dirname(os.path.abspath(__file__)), "..", "..", "scripts", "ops") + scripts_ops = os.path.normpath(scripts_ops) + if scripts_ops not in sys.path: + sys.path.insert(0, scripts_ops) + + try: + from extract_kiro_session import load_index, save_index, load_full_index, save_full_index + except ImportError: + return + + index = load_index() + entries = index.get("entries", {}) + if not entries: + return + + # 收集所有缺少 description 的主对话 entry + targets = [] + for eid, ent in entries.items(): + if ent.get("is_sub"): + continue + if not ent.get("description"): + targets.append((eid, ent)) + + if not targets: + return + + # agent_on_stop 场景下限制处理数量,避免超时 + # 批量处理积压用独立脚本 batch_generate_summaries.py + MAX_PER_RUN = 10 + if len(targets) > MAX_PER_RUN: + # 优先处理最新的(按 startTime 降序) + targets.sort(key=lambda t: t[1].get("startTime", ""), reverse=True) + targets = 
targets[:MAX_PER_RUN] + + try: + from openai import OpenAI + client = OpenAI(api_key=api_key, base_url=base_url) + except Exception: + return + + full_index = load_full_index() + full_entries = full_index.get("entries", {}) + generated = 0 + + for target_eid, target_entry in targets: + out_dir = target_entry.get("output_dir", "") + if not out_dir or not os.path.isdir(out_dir): + continue + + # 找到该 entry 对应的 main_*.md 文件 + main_files = sorted( + f for f in os.listdir(out_dir) + if f.startswith("main_") and f.endswith(".md") + and target_eid[:8] in f # 按 executionId 短码匹配 + ) + if not main_files: + # 回退:取目录下所有 main 文件 + main_files = sorted( + f for f in os.listdir(out_dir) + if f.startswith("main_") and f.endswith(".md") + ) + if not main_files: + continue + + content_parts = [] + for mf in main_files: + try: + with open(os.path.join(out_dir, mf), "r", encoding="utf-8") as fh: + content_parts.append(fh.read()) + except Exception: + continue + if not content_parts: + continue + + content = "\n\n---\n\n".join(content_parts) + content = _extract_summary_content(content) + if len(content) > 60000: + content = content[:60000] + "\n\n[TRUNCATED]" + + try: + resp = client.chat.completions.create( + model=model, + messages=[ + {"role": "system", "content": _SUMMARY_SYSTEM_PROMPT}, + {"role": "user", "content": f"请为以下单轮执行记录生成摘要:\n\n{content}"}, + ], + max_tokens=4096, + ) + description = resp.choices[0].message.content.strip() + except Exception: + continue # 单条失败不影响其他 + + if not description: + continue + + # 写入双索引(内存中) + entries[target_eid]["description"] = description + if target_eid in full_entries: + full_entries[target_eid]["description"] = description + generated += 1 + + # 批量保存 + if generated > 0: + save_index(index) + save_full_index(full_index) def main(): - # 非 git 仓库直接退出 - try: - r = subprocess.run( - ["git", "rev-parse", "--is-inside-work-tree"], - capture_output=True, text=True, encoding="utf-8", errors="replace", timeout=5 - ) - if r.returncode != 0: - return - 
except Exception: - return - + ensure_repo_root() now = now_taipei() - current_files = get_changed_files() + force_rebuild = "--force-rebuild" in sys.argv - # 步骤 1:检测变更 - real_files, external_files, no_change = detect_changes(current_files) - - # 无任何文件变更 → 跳过所有审查 - if no_change: - return - - # 步骤 2:Session Log(始终记录,包括外部变更) + # 全量会话记录提取(无论是否有文件变更,每次对话都要记录) try: - do_session_log(now, real_files, external_files) + do_full_session_extract() except Exception: pass - # 步骤 3:合规预扫描 + # 步骤 1:基于文件基线检测变更 + real_files, external_files, diff_result, no_change = detect_changes_via_baseline() + + # 无任何文件变更 → 跳过所有审查(除非 --force-rebuild) + if no_change and not force_rebuild: + return + + # --force-rebuild 且无变更时,仍需基于 git status 重建 context + if no_change and force_rebuild: + try: + compliance = do_compliance_prescan(real_files or []) + except Exception: + compliance = {} + try: + do_build_audit_context(real_files or [], diff_result, compliance) + except Exception: + pass + return + + # 步骤 2:合规预扫描(基于本次对话变更的文件) compliance = {} try: - compliance = do_compliance_prescan(current_files) + compliance = do_compliance_prescan(real_files) except Exception: pass - # 步骤 4:构建审计上下文(预备 /audit 使用) + # 步骤 4:构建审计上下文 try: - do_build_audit_context(current_files, external_files, compliance) + do_build_audit_context(real_files, diff_result, compliance) except Exception: pass - # 步骤 5:审计提醒(最后执行,可能 sys.exit(1)) + # 步骤 7:审计提醒(信息性,exit(0),不触发 agent 自行审计) try: do_audit_reminder(real_files) except SystemExit: - raise + pass # exit(0) 信息性退出,不需要 re-raise except Exception: pass diff --git a/.kiro/scripts/audit_flagger.py b/.kiro/scripts/audit_flagger.py index dcf4b5c..bd0dd21 100644 --- a/.kiro/scripts/audit_flagger.py +++ b/.kiro/scripts/audit_flagger.py @@ -1,5 +1,5 @@ #!/usr/bin/env python3 -"""audit_flagger — 判断 git 工作区是否存在高风险改动,写入 .kiro/.audit_state.json +"""audit_flagger — 判断 git 工作区是否存在高风险改动,写入 .kiro/state/.audit_state.json 替代原 PowerShell 版本,避免 Windows PowerShell 5.1 解析器 bug。 """ @@ -37,7 +37,7 @@ DB_PATTERNS 
= [ re.compile(r"\.prisma$"), ] -STATE_PATH = os.path.join(".kiro", ".audit_state.json") +STATE_PATH = os.path.join(".kiro", "state", ".audit_state.json") def now_taipei(): @@ -78,7 +78,7 @@ def is_noise(f: str) -> bool: def write_state(state: dict): - os.makedirs(".kiro", exist_ok=True) + os.makedirs(os.path.join(".kiro", "state"), exist_ok=True) with open(STATE_PATH, "w", encoding="utf-8") as fh: json.dump(state, fh, indent=2, ensure_ascii=False) diff --git a/.kiro/scripts/audit_reminder.py b/.kiro/scripts/audit_reminder.py index 0e72724..844bded 100644 --- a/.kiro/scripts/audit_reminder.py +++ b/.kiro/scripts/audit_reminder.py @@ -11,7 +11,7 @@ import sys from datetime import datetime, timezone, timedelta TZ_TAIPEI = timezone(timedelta(hours=8)) -STATE_PATH = os.path.join(".kiro", ".audit_state.json") +STATE_PATH = os.path.join(".kiro", "state", ".audit_state.json") MIN_INTERVAL = timedelta(minutes=15) @@ -30,7 +30,7 @@ def load_state(): def save_state(state): - os.makedirs(".kiro", exist_ok=True) + os.makedirs(os.path.join(".kiro", "state"), exist_ok=True) with open(STATE_PATH, "w", encoding="utf-8") as f: json.dump(state, f, indent=2, ensure_ascii=False) diff --git a/.kiro/scripts/build_audit_context.py b/.kiro/scripts/build_audit_context.py index 2b89ebe..1052326 100644 --- a/.kiro/scripts/build_audit_context.py +++ b/.kiro/scripts/build_audit_context.py @@ -2,13 +2,13 @@ """build_audit_context — 合并所有前置 hook 产出,生成统一审计上下文快照。 读取: -- .kiro/.audit_state.json(audit-flagger 产出:风险判定、变更文件列表) -- .kiro/.compliance_state.json(change-compliance 产出:文档缺失、迁移状态) -- .kiro/.last_prompt_id.json(prompt-audit-log 产出:Prompt ID 溯源) +- .kiro/state/.audit_state.json(audit-flagger 产出:风险判定、变更文件列表) +- .kiro/state/.compliance_state.json(change-compliance 产出:文档缺失、迁移状态) +- .kiro/state/.last_prompt_id.json(prompt-audit-log 产出:Prompt ID 溯源) - git diff --stat HEAD(变更统计摘要) - git diff HEAD(仅高风险文件的 diff,截断到合理长度) -输出:.kiro/.audit_context.json(audit-writer 子代理的唯一输入) 
+输出:.kiro/state/.audit_context.json(audit-writer 子代理的唯一输入) """ import json @@ -19,7 +19,7 @@ import sys from datetime import datetime, timezone, timedelta TZ_TAIPEI = timezone(timedelta(hours=8)) -CONTEXT_PATH = os.path.join(".kiro", ".audit_context.json") +CONTEXT_PATH = os.path.join(".kiro", "state", ".audit_context.json") # 高风险路径(只对这些文件取 diff,避免 diff 过大) HIGH_RISK_PATTERNS = [ @@ -108,9 +108,9 @@ def main(): now = datetime.now(TZ_TAIPEI) # 读取前置 hook 产出 - audit_state = safe_read_json(os.path.join(".kiro", ".audit_state.json")) - compliance = safe_read_json(os.path.join(".kiro", ".compliance_state.json")) - prompt_id_info = safe_read_json(os.path.join(".kiro", ".last_prompt_id.json")) + audit_state = safe_read_json(os.path.join(".kiro", "state", ".audit_state.json")) + compliance = safe_read_json(os.path.join(".kiro", "state", ".compliance_state.json")) + prompt_id_info = safe_read_json(os.path.join(".kiro", "state", ".last_prompt_id.json")) # 从 audit_state 提取高风险文件 changed_files = audit_state.get("changed_files", []) @@ -156,7 +156,7 @@ def main(): "latest_prompt_log": prompt_log, } - os.makedirs(".kiro", exist_ok=True) + os.makedirs(os.path.join(".kiro", "state"), exist_ok=True) with open(CONTEXT_PATH, "w", encoding="utf-8") as f: json.dump(context, f, indent=2, ensure_ascii=False) diff --git a/.kiro/scripts/change_compliance_prescan.py b/.kiro/scripts/change_compliance_prescan.py index 8dd3075..8fb09fa 100644 --- a/.kiro/scripts/change_compliance_prescan.py +++ b/.kiro/scripts/change_compliance_prescan.py @@ -17,7 +17,7 @@ import sys from datetime import datetime, timezone, timedelta TZ_TAIPEI = timezone(timedelta(hours=8)) -STATE_PATH = os.path.join(".kiro", ".audit_state.json") +STATE_PATH = os.path.join(".kiro", "state", ".audit_state.json") # doc-map 中定义的文档对应关系 DOC_MAP = { @@ -179,12 +179,12 @@ def classify_files(files): return result -COMPLIANCE_STATE_PATH = os.path.join(".kiro", ".compliance_state.json") +COMPLIANCE_STATE_PATH = os.path.join(".kiro", 
"state", ".compliance_state.json") def save_compliance_state(result, needs_check): """持久化合规检查结果,供 audit-writer 子代理读取""" - os.makedirs(".kiro", exist_ok=True) + os.makedirs(os.path.join(".kiro", "state"), exist_ok=True) now = datetime.now(TZ_TAIPEI) state = { "needs_check": needs_check, diff --git a/.kiro/scripts/file_baseline.py b/.kiro/scripts/file_baseline.py new file mode 100644 index 0000000..c783b45 --- /dev/null +++ b/.kiro/scripts/file_baseline.py @@ -0,0 +1,170 @@ +#!/usr/bin/env python3 +"""file_baseline — 基于文件 mtime+size 的独立基线快照系统。 + +不依赖 git commit 历史,通过扫描工作区文件的 (mtime, size) 指纹, +在 promptSubmit 和 agentStop 之间精确检测"本次对话期间"的文件变更。 + +用法: + from file_baseline import scan_workspace, diff_baselines, save_baseline, load_baseline +""" + +import json +import os +import re +from typing import TypedDict + +BASELINE_PATH = os.path.join(".kiro", "state", ".file_baseline.json") + +# 扫描时排除的目录(与 .gitignore 对齐 + 额外排除) +EXCLUDE_DIRS = { + ".git", ".venv", "venv", "ENV", "env", + "node_modules", "__pycache__", ".hypothesis", ".pytest_cache", + ".idea", ".vscode", ".specstory", + "build", "dist", "eggs", ".eggs", + "export", "reports", "tmp", + "htmlcov", ".coverage", + # Kiro 运行时状态不参与业务变更检测 + ".kiro", +} + +# 扫描时排除的文件后缀 +EXCLUDE_SUFFIXES = { + ".pyc", ".pyo", ".pyd", ".so", ".egg", ".whl", + ".log", ".jsonl", ".lnk", + ".swp", ".swo", +} + +# 扫描时排除的文件名模式 +EXCLUDE_NAMES = { + ".DS_Store", "Thumbs.db", "desktop.ini", +} + +# 业务目录白名单(只扫描这些顶层目录 + 根目录散文件) +# 这样可以避免扫描 .vite/deps 等深层缓存目录 +SCAN_ROOTS = [ + "apps", + "packages", + "db", + "docs", + "scripts", + "tests", +] + + +class FileEntry(TypedDict): + mtime: float + size: int + + +class DiffResult(TypedDict): + added: list[str] + modified: list[str] + deleted: list[str] + + +def _should_exclude_dir(dirname: str) -> bool: + """判断目录是否应排除""" + return dirname in EXCLUDE_DIRS or dirname.startswith(".") + + +def _should_exclude_file(filename: str) -> bool: + """判断文件是否应排除""" + if filename in EXCLUDE_NAMES: + return True + _, ext = 
os.path.splitext(filename) + if ext.lower() in EXCLUDE_SUFFIXES: + return True + return False + + +def scan_workspace(root: str = ".") -> dict[str, FileEntry]: + """扫描工作区,返回 {相对路径: {mtime, size}} 字典。 + + 只扫描 SCAN_ROOTS 中的目录 + 根目录下的散文件, + 跳过 EXCLUDE_DIRS / EXCLUDE_SUFFIXES / EXCLUDE_NAMES。 + """ + result: dict[str, FileEntry] = {} + + # 1. 根目录散文件(pyproject.toml, .env 等) + try: + for entry in os.scandir(root): + if entry.is_file(follow_symlinks=False): + if _should_exclude_file(entry.name): + continue + try: + st = entry.stat(follow_symlinks=False) + rel = entry.name.replace("\\", "/") + result[rel] = {"mtime": st.st_mtime, "size": st.st_size} + except OSError: + continue + except OSError: + pass + + # 2. 业务目录递归扫描 + for scan_root in SCAN_ROOTS: + top = os.path.join(root, scan_root) + if not os.path.isdir(top): + continue + for dirpath, dirnames, filenames in os.walk(top): + # 原地修改 dirnames 以跳过排除目录 + dirnames[:] = [ + d for d in dirnames + if not _should_exclude_dir(d) + ] + for fname in filenames: + if _should_exclude_file(fname): + continue + full = os.path.join(dirpath, fname) + try: + st = os.stat(full) + rel = os.path.relpath(full, root).replace("\\", "/") + result[rel] = {"mtime": st.st_mtime, "size": st.st_size} + except OSError: + continue + + return result + + +def diff_baselines( + before: dict[str, FileEntry], + after: dict[str, FileEntry], +) -> DiffResult: + """对比两次快照,返回 added/modified/deleted 列表。""" + before_keys = set(before.keys()) + after_keys = set(after.keys()) + + added = sorted(after_keys - before_keys) + deleted = sorted(before_keys - after_keys) + + modified = [] + for path in sorted(before_keys & after_keys): + b = before[path] + a = after[path] + # mtime 或 size 任一变化即视为修改 + if b["mtime"] != a["mtime"] or b["size"] != a["size"]: + modified.append(path) + + return {"added": added, "modified": modified, "deleted": deleted} + + +def save_baseline(data: dict[str, FileEntry], path: str = BASELINE_PATH): + """保存基线快照到 JSON 文件。""" + 
os.makedirs(os.path.dirname(path) or ".kiro", exist_ok=True) + with open(path, "w", encoding="utf-8") as f: + json.dump(data, f, ensure_ascii=False) + + +def load_baseline(path: str = BASELINE_PATH) -> dict[str, FileEntry]: + """加载基线快照,文件不存在返回空字典。""" + if not os.path.isfile(path): + return {} + try: + with open(path, "r", encoding="utf-8") as f: + return json.load(f) + except Exception: + return {} + + +def total_changes(diff: DiffResult) -> int: + """变更文件总数""" + return len(diff["added"]) + len(diff["modified"]) + len(diff["deleted"]) diff --git a/.kiro/scripts/prompt_audit_log.py b/.kiro/scripts/prompt_audit_log.py index 3bd214b..83ac330 100644 --- a/.kiro/scripts/prompt_audit_log.py +++ b/.kiro/scripts/prompt_audit_log.py @@ -46,9 +46,9 @@ def main(): f.write(entry) # 保存 last prompt id 供 /audit 溯源 - os.makedirs(".kiro", exist_ok=True) + os.makedirs(os.path.join(".kiro", "state"), exist_ok=True) last_prompt = {"prompt_id": prompt_id, "at": now.isoformat()} - with open(os.path.join(".kiro", ".last_prompt_id.json"), "w", encoding="utf-8") as f: + with open(os.path.join(".kiro", "state", ".last_prompt_id.json"), "w", encoding="utf-8") as f: json.dump(last_prompt, f, indent=2, ensure_ascii=False) diff --git a/.kiro/scripts/prompt_on_submit.py b/.kiro/scripts/prompt_on_submit.py index 3c086ee..f31e73c 100644 --- a/.kiro/scripts/prompt_on_submit.py +++ b/.kiro/scripts/prompt_on_submit.py @@ -1,11 +1,13 @@ #!/usr/bin/env python3 -"""prompt_on_submit — promptSubmit 合并 hook 脚本。 +"""prompt_on_submit — promptSubmit 合并 hook 脚本(v2:文件基线模式)。 合并原 audit_flagger + prompt_audit_log 的功能: -1. git status → 风险判定 → 写 .kiro/.audit_state.json -2. 记录 prompt 日志 → docs/audit/prompt_logs/ -3. 记录当前 git fingerprint 快照 → .kiro/.git_snapshot.json(供 agentStop 对比) +1. 扫描工作区文件 → 保存基线快照 → .kiro/state/.file_baseline.json +2. 基于基线文件列表做风险判定 → .kiro/state/.audit_state.json +3. 
记录 prompt 日志 → docs/audit/prompt_logs/ +变更检测不再依赖 git status(解决不常 commit 导致的误判问题)。 +风险判定仍基于 git status(因为需要知道哪些文件相对于 commit 有变化)。 所有功能块用 try/except 隔离,单个失败不影响其他。 """ @@ -17,6 +19,11 @@ import subprocess import sys from datetime import datetime, timezone, timedelta +# 同目录导入文件基线模块 + cwd 校验 +sys.path.insert(0, os.path.dirname(os.path.abspath(__file__))) +from file_baseline import scan_workspace, save_baseline +from _ensure_root import ensure_repo_root + TZ_TAIPEI = timezone(timedelta(hours=8)) # ── 风险规则(来自 audit_flagger) ── @@ -43,9 +50,8 @@ DB_PATTERNS = [ re.compile(r"\.prisma$"), ] -STATE_PATH = os.path.join(".kiro", ".audit_state.json") -SNAPSHOT_PATH = os.path.join(".kiro", ".git_snapshot.json") -PROMPT_ID_PATH = os.path.join(".kiro", ".last_prompt_id.json") +STATE_PATH = os.path.join(".kiro", "state", ".audit_state.json") +PROMPT_ID_PATH = os.path.join(".kiro", "state", ".last_prompt_id.json") def now_taipei(): @@ -56,7 +62,8 @@ def sha1hex(s: str) -> str: return hashlib.sha1(s.encode("utf-8")).hexdigest() -def get_changed_files() -> list[str]: +def get_git_changed_files() -> list[str]: + """通过 git status 获取变更文件(仅用于风险判定,不用于变更检测)""" try: r = subprocess.run( ["git", "status", "--porcelain"], @@ -94,14 +101,14 @@ def safe_read_json(path): def write_json(path, data): - os.makedirs(os.path.dirname(path) or ".kiro", exist_ok=True) + os.makedirs(os.path.dirname(path) or os.path.join(".kiro", "state"), exist_ok=True) with open(path, "w", encoding="utf-8") as f: json.dump(data, f, indent=2, ensure_ascii=False) -# ── 功能块 1:风险标记(audit_flagger) ── -def do_audit_flag(all_files, now): - files = sorted(set(f for f in all_files if not is_noise(f))) +# ── 功能块 1:风险标记(基于 git status,判定哪些文件需要审计) ── +def do_audit_flag(git_files, now): + files = sorted(set(f for f in git_files if not is_noise(f))) if not files: write_json(STATE_PATH, { @@ -184,47 +191,38 @@ def do_prompt_log(now): write_json(PROMPT_ID_PATH, {"prompt_id": prompt_id, "at": now.isoformat()}) -# ── 功能块 3:Git 快照(供 agentStop 
对比检测非 Kiro 变更) ── -def do_git_snapshot(all_files, now): - fp = sha1hex("\n".join(sorted(all_files))) if all_files else "" - write_json(SNAPSHOT_PATH, { - "files": sorted(all_files)[:100], - "fingerprint": fp, - "taken_at": now.isoformat(), - }) +# ── 功能块 3:文件基线快照(替代 git snapshot) ── +def do_file_baseline(): + """扫描工作区文件 mtime+size,保存为基线快照。 + agentStop 时再扫一次对比,即可精确检测本次对话期间的变更。 + """ + baseline = scan_workspace(".") + save_baseline(baseline) def main(): - # 非 git 仓库直接退出 - try: - r = subprocess.run( - ["git", "rev-parse", "--is-inside-work-tree"], - capture_output=True, text=True, encoding="utf-8", errors="replace", timeout=5 - ) - if r.returncode != 0: - return - except Exception: - return - + ensure_repo_root() now = now_taipei() - all_files = get_changed_files() - # 各功能块独立 try/except + # 功能块 3:文件基线快照(最先执行,记录对话开始时的文件状态) try: - do_audit_flag(all_files, now) + do_file_baseline() except Exception: pass + # 功能块 1:风险标记(仍用 git status,因为需要知道未提交的变更) + try: + git_files = get_git_changed_files() + do_audit_flag(git_files, now) + except Exception: + pass + + # 功能块 2:Prompt 日志 try: do_prompt_log(now) except Exception: pass - try: - do_git_snapshot(all_files, now) - except Exception: - pass - if __name__ == "__main__": try: diff --git a/.kiro/scripts/session_log.py b/.kiro/scripts/session_log.py index 82c6201..91dbf79 100644 --- a/.kiro/scripts/session_log.py +++ b/.kiro/scripts/session_log.py @@ -4,8 +4,8 @@ 收集来源: - 环境变量 AGENT_OUTPUT(Kiro 注入的 agent 输出) - 环境变量 USER_PROMPT(最近一次用户输入) -- .kiro/.last_prompt_id.json(Prompt ID 溯源) -- .kiro/.audit_state.json(变更文件列表) +- .kiro/state/.last_prompt_id.json(Prompt ID 溯源) +- .kiro/state/.audit_state.json(变更文件列表) - git diff --stat(变更统计) 输出:docs/audit/session_logs/session_.md @@ -17,10 +17,14 @@ import subprocess import sys from datetime import datetime, timezone, timedelta +# cwd 校验 +sys.path.insert(0, os.path.dirname(os.path.abspath(__file__))) +from _ensure_root import ensure_repo_root + TZ_TAIPEI = timezone(timedelta(hours=8)) LOG_DIR = 
os.path.join("docs", "audit", "session_logs") -STATE_PATH = os.path.join(".kiro", ".audit_state.json") -PROMPT_ID_PATH = os.path.join(".kiro", ".last_prompt_id.json") +STATE_PATH = os.path.join(".kiro", "state", ".audit_state.json") +PROMPT_ID_PATH = os.path.join(".kiro", "state", ".last_prompt_id.json") def now_taipei(): @@ -60,6 +64,7 @@ def git_status_short(): def main(): + ensure_repo_root() now = now_taipei() ts = now.strftime("%Y%m%d_%H%M%S") timestamp_display = now.strftime("%Y-%m-%d %H:%M:%S %z") diff --git a/.kiro/settings/mcp.json b/.kiro/settings/mcp.json index a7fb969..9da43b1 100644 --- a/.kiro/settings/mcp.json +++ b/.kiro/settings/mcp.json @@ -1,5 +1,23 @@ { "mcpServers": { + "image-compare": { + "command": "npx", + "args": ["-y", "mcp-image-compare-server"], + "disabled": false, + "autoApprove": [ "*", + "all"] + }, + "weixin-devtools-mcp": { + "command": "npx", + "args": ["-y", "weixin-devtools-mcp", "--tools-profile=full", "--ws-endpoint=ws://127.0.0.1:9420"], + "env": { + "WECHAT_DEVTOOLS_CLI": "C:\\dev\\WechatDevtools\\cli.bat", + "WECHAT_DEVTOOLS_PROJECT": "C:\\NeoZQYY\\apps\\miniprogram" + }, + "disabled": false, + "autoApprove": [ "*", + "all"] + }, "git": { "command": "uvx", "args": [ @@ -7,7 +25,7 @@ "--repository", "C:\\NeoZQYY" ], - "disabled": false, + "disabled": true, "autoApprove": [ "all", "*" @@ -40,7 +58,7 @@ "env": { "DATABASE_URI": "postgresql://local-Python:Neo-local-1991125@100.64.0.4:5432/test_etl_feiqiu" }, - "disabled": false, + "disabled": true, "autoApprove": [ "all", "*" @@ -70,7 +88,7 @@ "env": { "DATABASE_URI": "postgresql://local-Python:Neo-local-1991125@100.64.0.4:5432/test_zqyy_app" }, - "disabled": false, + "disabled": true, "autoApprove": [ "all", "*" diff --git a/.kiro/skills/bd-manual-db-docs/assets/schema-changelog-template.md b/.kiro/skills/bd-manual-db-docs/assets/schema-changelog-template.md index a54221f..bf94d10 100644 --- a/.kiro/skills/bd-manual-db-docs/assets/schema-changelog-template.md +++ 
b/.kiro/skills/bd-manual-db-docs/assets/schema-changelog-template.md @@ -1,6 +1,6 @@ # Schema 变更日志(Schema Change Log) -- 日期(Asia/Shanghai,YYYY-MM-DD): +- 日期(Asia/Shanghai,YYYY-MM-DD HH:MM:SS,精确到秒): - Prompt-ID: - 原始原因(Prompt 摘录/原文): - 直接原因(必要性 + 方案简介): diff --git a/.kiro/skills/bd-manual-db-docs/assets/table-structure-template.md b/.kiro/skills/bd-manual-db-docs/assets/table-structure-template.md index 3da28c4..9496afd 100644 --- a/.kiro/skills/bd-manual-db-docs/assets/table-structure-template.md +++ b/.kiro/skills/bd-manual-db-docs/assets/table-structure-template.md @@ -19,4 +19,4 @@ - 例如:状态机枚举范围、唯一性、跨字段一致性约束(如有) ## 变更历史(Change History) -- YYYY-MM-DD | Prompt-ID | 直接原因 | 变更摘要 +- YYYY-MM-DD HH:MM:SS | Prompt-ID | 直接原因 | 变更摘要 diff --git a/.kiro/skills/change-annotation-audit/assets/audit-record-template.md b/.kiro/skills/change-annotation-audit/assets/audit-record-template.md index 7dd436a..f9ea51f 100644 --- a/.kiro/skills/change-annotation-audit/assets/audit-record-template.md +++ b/.kiro/skills/change-annotation-audit/assets/audit-record-template.md @@ -1,6 +1,6 @@ # 变更审计记录(Change Audit Record) -- 日期/时间(Asia/Shanghai): +- 日期/时间(Asia/Shanghai,精确到秒,格式 YYYY-MM-DD HH:MM:SS): - Prompt-ID: - 原始原因(Prompt 原文或 ≤5 行摘录): - 直接原因(必要性 + 修改方案简介): diff --git a/.kiro/skills/change-annotation-audit/assets/file-changelog-templates.md b/.kiro/skills/change-annotation-audit/assets/file-changelog-templates.md index 84a8ec9..27d5b97 100644 --- a/.kiro/skills/change-annotation-audit/assets/file-changelog-templates.md +++ b/.kiro/skills/change-annotation-audit/assets/file-changelog-templates.md @@ -1,20 +1,22 @@ # 文件内 AI_CHANGELOG 与 CHANGE 标记模板 -## 通用 AI_CHANGELOG(建议放在文件头部或“变更记录”小节) -- 2026-02-13 | Prompt: P20260213-101530(摘录:...)| Direct cause:... | Summary:... | Verify:... +> 所有时间戳精确到秒,格式:`YYYY-MM-DD HH:MM:SS`,时区 Asia/Shanghai。 + +## 通用 AI_CHANGELOG(建议放在文件头部或"变更记录"小节) +- 2026-02-13 10:15:30 | Prompt: P20260213-101530(摘录:...)| Direct cause:... | Summary:... | Verify:... 
--- -## Markdown / 文档(放在文档末尾或“变更记录”小节) +## Markdown / 文档(放在文档末尾或"变更记录"小节) ### AI_CHANGELOG -- YYYY-MM-DD | Prompt: P...(摘录:...)| Direct cause:... | Summary:... | Verify:... +- YYYY-MM-DD HH:MM:SS | Prompt: P...(摘录:...)| Direct cause:... | Summary:... | Verify:... --- ## JS/TS(块注释) /* AI_CHANGELOG -- YYYY-MM-DD | Prompt: P...(摘录:...)| Direct cause:... | Summary:... | Verify:... +- YYYY-MM-DD HH:MM:SS | Prompt: P...(摘录:...)| Direct cause:... | Summary:... | Verify:... */ // [CHANGE P...] intent: ... @@ -27,7 +29,7 @@ AI_CHANGELOG ## Python(docstring/块注释) """ AI_CHANGELOG -- YYYY-MM-DD | Prompt: P...(摘录:...)| Direct cause:... | Summary:... | Verify:... +- YYYY-MM-DD HH:MM:SS | Prompt: P...(摘录:...)| Direct cause:... | Summary:... | Verify:... """ # [CHANGE P...] intent: ... @@ -40,7 +42,7 @@ AI_CHANGELOG ## SQL(块注释 + 行注释) /* AI_CHANGELOG -- YYYY-MM-DD | Prompt: P...(摘录:...)| Direct cause:... | Summary:... | Verify:... +- YYYY-MM-DD HH:MM:SS | Prompt: P...(摘录:...)| Direct cause:... | Summary:... | Verify:... */ -- [CHANGE P...] intent: ... -- assumptions: ... diff --git a/.kiro/specs/02-etl-dws-miniapp-extensions/design.md b/.kiro/specs/02-etl-dws-miniapp-extensions/design.md index 7bf15d7..1caf8cb 100644 --- a/.kiro/specs/02-etl-dws-miniapp-extensions/design.md +++ b/.kiro/specs/02-etl-dws-miniapp-extensions/design.md @@ -12,7 +12,7 @@ ### 设计决策 -1. **助教订单流水独立建表**:四项统计粒度为 `(site_id, assistant_id, stat_date)`,与现有 `dws_assistant_daily_detail` 粒度相同但语义不同(daily_detail 聚焦服务时长/金额,contribution 聚焦订单级流水分摊),独立建表避免字段膨胀。 +1. **助教订单流水独立建表**:四项统计粒度为 `(site_id, assistant_id, stat_date)`,与现有 `dws_assistant_daily_detail` 粒度相同但语义不同(daily_detail 聚焦服务时长/金额,contribution 聚焦订单级流水分摊),独立建表避免字段膨胀。`stat_date` 为营业日(以 `BUSINESS_DAY_START_HOUR` 08:00 为日切点)。 2. **时效贡献流水计算为纯函数**:核心分摊算法(`compute_time_weighted_revenue`)设计为静态方法,输入为结构化的订单数据,输出为每名助教的贡献值。不依赖数据库,便于属性测试。 3. **惩罚检测在 transform 阶段完成**:定档折算惩罚的时间重叠检测和计算在 `AssistantDailyTask.transform` 中完成,不新建独立任务,因为惩罚字段与日度明细同粒度。 4. 
**充值统计复用现有 extract 模式**:在 `MemberConsumptionTask` 中新增一个 `_extract_recharge_stats` 方法,与现有的 `_extract_consumption_stats` 并行提取,在 transform 阶段合并。 diff --git a/.kiro/specs/03-miniapp-auth-system/tasks.md b/.kiro/specs/03-miniapp-auth-system/tasks.md index 2996846..ef79b04 100644 --- a/.kiro/specs/03-miniapp-auth-system/tasks.md +++ b/.kiro/specs/03-miniapp-auth-system/tasks.md @@ -134,17 +134,17 @@ - **Property 9: 非 pending 申请审核拒绝** - **Validates: Requirements 6.6** -- [ ] 6. 检查点 - 确保所有测试通过 +- [x] 6. 检查点 - 确保所有测试通过 - 确保所有测试通过,如有问题请向用户确认。 -- [ ] 7. 实现权限中间件和管理端路由 - - [ ] 7.1 创建权限中间件 `apps/backend/app/middleware/permission.py` +- [x] 7. 实现权限中间件和管理端路由 + - [x] 7.1 创建权限中间件 `apps/backend/app/middleware/permission.py` - 实现 `require_permission(*permission_codes)` 依赖 - 实现 `require_approved()` 依赖 - 检查用户 status + 权限列表 - _Requirements: 9.1-9.4_ - - [ ] 7.2 创建管理端审核路由 `apps/backend/app/routers/admin_applications.py` + - [x] 7.2 创建管理端审核路由 `apps/backend/app/routers/admin_applications.py` - 实现 `GET /api/admin/applications`:查询申请列表 - 实现 `GET /api/admin/applications/{id}`:查询申请详情 + 候选匹配 - 实现 `POST /api/admin/applications/{id}/approve`:批准申请 @@ -152,51 +152,51 @@ - 在 `apps/backend/app/main.py` 中注册路由 - _Requirements: 6.1-6.6, 5.1-5.6_ - - [ ] 7.3 编写权限中间件拦截属性测试 + - [x] 7.3 编写权限中间件拦截属性测试 - **Property 13: 权限中间件拦截正确性** - **Validates: Requirements 8.3, 9.1, 9.2, 9.3** - - [ ] 7.4 编写多店铺角色独立分配属性测试 + - [x] 7.4 编写多店铺角色独立分配属性测试 - **Property 11: 多店铺角色独立分配** - **Validates: Requirements 8.1** - - [ ] 7.5 编写店铺切换令牌属性测试 + - [x] 7.5 编写店铺切换令牌属性测试 - **Property 12: 店铺切换令牌正确性** - **Validates: Requirements 8.2, 10.4** -- [ ] 8. 集成与端到端验证 - - [ ] 8.1 更新 `apps/backend/app/config.py` 新增微信配置项 +- [x] 8. 
集成与端到端验证 + - [x] 8.1 更新 `apps/backend/app/config.py` 新增微信配置项 - 新增 `WX_APPID`、`WX_SECRET`、`WX_DEV_MODE` 配置读取 - _Requirements: 3.1, 14.3_ - - [ ] 8.2 更新 `apps/backend/app/main.py` 注册所有新路由 + - [x] 8.2 更新 `apps/backend/app/main.py` 注册所有新路由 - 确保 `xcx_auth` 和 `admin_applications` 路由已注册 - 验证无路由冲突 - _Requirements: 全部_ - - [ ] 8.3 实现开发模式 mock 登录端点 + - [x] 8.3 实现开发模式 mock 登录端点 - 在 `routers/xcx_auth.py` 中新增 `POST /api/xcx/dev-login` - 仅在 `WX_DEV_MODE=true` 时注册 - 接受 `openid` 和可选 `status` 参数,直接查找/创建用户并返回 JWT - _Requirements: 14.2, 14.3_ - - [ ] 8.4 编写用户状态查询完整性属性测试 + - [x] 8.4 编写用户状态查询完整性属性测试 - **Property 10: 用户状态查询完整性** - **Validates: Requirements 7.1, 7.2** - - [ ] 8.5 编写 disabled 用户登录拒绝属性测试 + - [x] 8.5 编写 disabled 用户登录拒绝属性测试 - **Property 3: disabled 用户登录拒绝** - **Validates: Requirements 3.5** -- [ ] 9. 小程序认证前端页面 - - [ ] 9.1 实现请求封装工具 `apps/miniprogram/miniprogram/utils/request.ts` +- [x] 9. 小程序认证前端页面 + - [x] 9.1 实现请求封装工具 `apps/miniprogram/miniprogram/utils/request.ts` - 统一请求封装:自动附加 Authorization header - 401 时自动尝试 refresh_token 刷新 - 刷新失败时跳转 login 页面 - 后端 base URL 从配置读取(开发环境 `http://localhost:8000`) - _Requirements: 13.8_ - - [ ] 9.2 实现登录页 `apps/miniprogram/miniprogram/pages/login/` + - [x] 9.2 实现登录页 `apps/miniprogram/miniprogram/pages/login/` - 调用 `wx.login()` 获取 code - 发送 code 到 `POST /api/xcx/login` - 根据返回的 `user_status` 路由到对应页面 @@ -204,7 +204,7 @@ - 参考 H5 原型 `docs/h5_ui/pages/login.html` - _Requirements: 13.1, 13.6, 13.7, 13.8_ - - [ ] 9.3 实现申请表单页 `apps/miniprogram/miniprogram/pages/apply/` + - [x] 9.3 实现申请表单页 `apps/miniprogram/miniprogram/pages/apply/` - 表单字段:球房ID(site_code)、申请身份、手机号、编号(选填)、昵称 - 前端校验:site_code 格式(2字母+3数字)、手机号(11位数字) - 提交到 `POST /api/xcx/apply` @@ -212,7 +212,7 @@ - 参考 H5 原型 `docs/h5_ui/pages/apply.html` - _Requirements: 13.2, 13.3_ - - [ ] 9.4 实现审核等待页 `apps/miniprogram/miniprogram/pages/reviewing/` + - [x] 9.4 实现审核等待页 `apps/miniprogram/miniprogram/pages/reviewing/` - 显示当前申请状态(审核中/已拒绝) - 显示申请信息摘要(球房ID、申请身份、手机号) - 拒绝时显示拒绝原因 + "重新申请"按钮 @@ -220,42 +220,68 @@ - 
参考 H5 原型 `docs/h5_ui/pages/reviewing.html` - _Requirements: 13.4, 13.5_ - - [ ] 9.5 实现无权限页 `apps/miniprogram/miniprogram/pages/no-permission/` + - [x] 9.5 实现无权限页 `apps/miniprogram/miniprogram/pages/no-permission/` - 显示账号已禁用提示 - 参考 H5 原型 `docs/h5_ui/pages/no-permission.html` - _Requirements: 13.7_ - - [ ] 9.6 更新 `app.ts` 和 `app.json` + - [x] 9.6 更新 `app.ts` 和 `app.json` - 在 `app.json` 中注册新页面(login、apply、reviewing、no-permission) - 在 `app.ts` 的 `onLaunch` 中实现自动登录逻辑 - 根据用户状态路由到对应页面 - 扩展 globalData 类型定义(token、userInfo、currentSiteId、sites) - _Requirements: 13.8_ -- [ ] 10. 前后端联调验证 - - [ ] 10.1 编写联调指南文档 `apps/miniprogram/doc/auth-integration-guide.md` +- [x] 10. 前后端联调验证 + - [x] 10.1 编写联调指南文档 `apps/miniprogram/doc/auth-integration-guide.md` - 微信开发者工具项目导入配置说明 - 后端启动步骤(含 `WX_DEV_MODE=true` 配置) - 测试流程:mock 登录 → 申请 → 管理端审核 → 重新登录验证 - 常见问题排查 - _Requirements: 14.1, 14.4_ - - [ ] 10.2 在微信开发者工具中执行联调验证 + - [x] 10.2 在微信开发者工具中执行联调验证 - 验证登录流程:wx.login → 后端 → JWT 返回 - 验证申请流程:表单提交 → 后端创建申请 → 审核等待页展示 - 验证状态路由:pending/approved/rejected/disabled 各状态正确跳转 - 验证 token 刷新:access_token 过期后自动刷新 - _Requirements: 14.1_ -- [ ] 11. 最终检查点 - 确保所有测试通过 - - 确保所有测试通过,如有问题请向用户确认。 +- [x] 11. 属性测试全量运行(100 次迭代)— ✅ 15/15 全部通过 + - 前面各任务中的属性测试仅用 5 次迭代快速验证逻辑正确性 + - 本任务集中对所有属性测试执行 100 次迭代,确保健壮性 + - 运行脚本:`scripts/ops/_run_auth_pbt_full.py` + - 结果报告:`export/reports/auth_pbt_full_20260227_034401.md` + - 总耗时 375s,15 个属性测试全部通过(100 次迭代/每个) + - [x] 11.1 P1 迁移脚本幂等性 — ✅ 25.0s + - [x] 11.2 P2 登录创建/查找用户 — ✅ 49.8s + - [x] 11.3 P3 disabled 用户登录拒绝 — ✅ 37.9s + - [x] 11.4 P4 申请创建正确性 — ✅ 21.3s + - [x] 11.5 P5 手机号格式验证 — ✅ 2.5s + - [x] 11.6 P6 重复申请拒绝 — ✅ 24.7s + - [x] 11.7 P7 人员匹配合并正确性 — ✅ 14.9s + - [x] 11.8 P8 审核操作正确性 — ✅ 22.7s + - [x] 11.9 P9 非 pending 审核拒绝 — ✅ 18.8s + - [x] 11.10 P10 用户状态查询完整性 — ✅ 28.6s + - [x] 11.11 P11 多店铺角色独立分配 — ✅ 46.6s + - [x] 11.12 P12 店铺切换令牌正确性 — ✅ 45.3s + - [x] 11.13 P13 权限中间件拦截正确性 — ✅ 11.9s + - [x] 11.14 P14 JWT payload 结构一致性 — ✅ 4.6s + - [x] 11.15 P15 JWT 过期/无效拒绝 — ✅ 3.2s + +- [x] 12. 
最终检查点 + - 任务 1-12 全部完成 + - 15 个属性测试在 100 次迭代下全部通过(报告见 `export/reports/auth_pbt_full_20260227_034401.md`) + - 小程序 4 个认证页面(login/apply/reviewing/no-permission)已创建 + - app.ts / app.json 已更新为认证感知版本 + - 联调指南文档已编写 ## 备注 - 标记 `*` 的任务为可选,可跳过以加速 MVP - 每个任务引用了具体的需求编号,确保可追溯 - 检查点确保增量验证 -- 属性测试验证通用正确性属性(hypothesis,最少 100 次迭代) +- **属性测试策略**:开发阶段各任务中属性测试用 5 次迭代快速验证;任务 11 集中用 100 次迭代全量运行,逐个报告进度 - 单元测试验证具体例子和边界情况 - 所有数据库操作在测试库 `test_zqyy_app` 进行 - 迁移脚本放在 `db/zqyy_app/migrations/` 目录 diff --git a/.kiro/specs/04-miniapp-core-business/.config.kiro b/.kiro/specs/04-miniapp-core-business/.config.kiro new file mode 100644 index 0000000..4c04556 --- /dev/null +++ b/.kiro/specs/04-miniapp-core-business/.config.kiro @@ -0,0 +1 @@ +{"specId": "27029642-a405-4932-8c22-5bc54fad5173", "workflowType": "requirements-first", "specType": "feature"} \ No newline at end of file diff --git a/.kiro/specs/04-miniapp-core-business/design.md b/.kiro/specs/04-miniapp-core-business/design.md new file mode 100644 index 0000000..d0ef929 --- /dev/null +++ b/.kiro/specs/04-miniapp-core-business/design.md @@ -0,0 +1,1115 @@ +# 设计文档:小程序核心业务模块(miniapp-core-business) + +## 概述 + +本设计在 P1(miniapp-db-foundation)、P2(etl-dws-miniapp-extensions)、P3(miniapp-auth-system)基础上,实现小程序的核心业务逻辑: + +1. **助教任务系统**:基于 WBI/NCI/RS 指数自动生成 4 种类型任务,支持状态流转(active → inactive/completed/abandoned)、类型变更追溯、48 小时回访滞留机制 +2. **备注系统**:统一备注 CRUD,支持普通备注/回访备注/放弃原因三种类型,含星星评分(再次服务意愿 + 再来店可能性,各 1-5) +3. **召回完成检测与备注回溯**:ETL 数据更新后自动检测助教服务记录,匹配活跃任务标记完成,回溯重分类普通备注为回访备注 +4. 
**触发器调度框架**:统一的 cron/interval/event 三种触发方式调度引擎,驱动任务生成、有效期检查、召回检测、备注回溯 + +**环境变量依赖**: + +| 环境变量 | 用途 | 来源 | +|---------|------|------| +| `APP_DB_DSN` / `DB_HOST` 等 | 业务库连接(`test_zqyy_app`) | 根 `.env` | +| `PG_DSN` / `ETL_DB_HOST` 等 | ETL 库连接(FDW 读取指数) | 根 `.env` | +| `JWT_SECRET_KEY` | JWT 签名密钥 | `.env.local` | + +**整体数据流向**: + +``` +ETL 库(test_etl_feiqiu) 业务库(test_zqyy_app) +┌──────────────────────────┐ ┌──────────────────────────────┐ +│ dws.dws_member_winback_ │ │ auth Schema │ +│ index (WBI) │ │ └ user_assistant_binding │ +│ dws.dws_member_newconv_ │ │ │ +│ index (NCI) │ FDW 读取 │ biz Schema │ +│ dws.dws_member_assistant_│ ◄────────────► │ ├ coach_tasks │ +│ relation_index (RS) │ │ ├ coach_task_history │ +│ dwd.dwd_assistant_ │ │ ├ notes │ +│ service_log │ │ └ trigger_jobs │ +│ │ │ │ +│ app Schema (RLS 视图) │ │ fdw_etl Schema (外部表) │ +│ └ v_dws_*, v_dwd_* │ │ └ v_dws_*, v_dwd_* │ +└──────────────────────────┘ │ │ + │ public Schema │ + │ └ member_retention_clue │ + └──────────────────────────────┘ +``` + +## 架构 + +### 分层架构 + +```mermaid +graph TB + subgraph "小程序端" + MP["微信小程序
任务列表 / 备注 / 评分"] + end + + subgraph "FastAPI 后端(apps/backend/)" + subgraph "路由层" + XCX_TASK["routers/xcx_tasks.py
任务列表 / 置顶 / 放弃"] + XCX_NOTE["routers/xcx_notes.py
备注 CRUD / 星星评分"] + end + + subgraph "中间件层" + PERM_MW["middleware/permission.py
require_approved()"] + end + + subgraph "服务层" + TASK_GEN["services/task_generator.py
任务生成器"] + TASK_MGR["services/task_manager.py
任务 CRUD + 状态流转"] + EXPIRY["services/task_expiry.py
有效期轮询"] + RECALL["services/recall_detector.py
召回完成检测"] + RECLASS["services/note_reclassifier.py
备注回溯重分类"] + NOTE_SVC["services/note_service.py
备注 CRUD"] + TRIGGER["services/trigger_scheduler.py
触发器调度框架"] + end + + DB["database.py
get_connection() / get_etl_readonly_connection()"] + end + + subgraph "数据库" + BIZ["biz Schema
coach_tasks / notes / trigger_jobs"] + AUTH["auth Schema
user_assistant_binding"] + FDW["fdw_etl Schema
v_dws_* / v_dwd_*(只读)"] + PUB["public Schema
member_retention_clue"] + end + + MP --> XCX_TASK + MP --> XCX_NOTE + XCX_TASK --> PERM_MW + XCX_NOTE --> PERM_MW + PERM_MW --> TASK_MGR + PERM_MW --> NOTE_SVC + TRIGGER --> TASK_GEN + TRIGGER --> EXPIRY + TRIGGER --> RECALL + RECALL --> RECLASS + TASK_GEN --> DB + TASK_MGR --> DB + EXPIRY --> DB + RECALL --> DB + RECLASS --> DB + NOTE_SVC --> DB + DB --> BIZ + DB --> AUTH + DB --> FDW + DB --> PUB +``` + +### 触发器调度流程 + +```mermaid +sequenceDiagram + participant TS as TriggerScheduler + participant DB as PostgreSQL + participant TG as TaskGenerator + participant EC as TaskExpiryChecker + participant RD as RecallDetector + participant NR as NoteReclassifier + + Note over TS: 每 30 秒轮询 trigger_jobs + + TS->>DB: SELECT * FROM biz.trigger_jobs
WHERE status='enabled' AND next_run_at <= NOW() + + alt cron: task_generator(每日 04:00) + TS->>TG: run() + TG->>DB: 读取 FDW 指数数据(WBI/NCI/RS) + TG->>DB: 读取 auth.user_assistant_binding + TG->>DB: INSERT/UPDATE biz.coach_tasks + TG->>DB: INSERT biz.coach_task_history + TG->>DB: UPDATE trigger_jobs.last_run_at + end + + alt interval: task_expiry_check(每小时) + TS->>EC: run() + EC->>DB: SELECT coach_tasks WHERE expires_at < NOW() + EC->>DB: UPDATE status='inactive' + EC->>DB: UPDATE trigger_jobs.last_run_at + end + + alt event: recall_completion_check(ETL 更新后) + TS->>RD: run() + RD->>DB: 读取 FDW dwd_assistant_service_log + RD->>DB: 匹配 active 任务 → completed + RD->>DB: fire_event('recall_completed', payload) + end + + alt event: note_reclassify_backfill(召回完成时) + TS->>NR: run(payload) + NR->>DB: 查找 normal 备注 → 更新为 follow_up + NR->>DB: 触发 AI 应用 6 接口(P5 实现) + end +``` + +### 任务状态机 + +```mermaid +stateDiagram-v2 + [*] --> active: 任务生成器创建 + active --> inactive: 类型变更(旧任务关闭) + active --> inactive: expires_at 到期(轮询检查) + active --> completed: 召回完成检测 + active --> abandoned: 助教放弃(需填原因) + abandoned --> active: 助教取消放弃 + active --> active: 置顶/取消置顶(is_pinned 变更) + + note right of inactive: expires_at 机制:\n生成时 NULL(无限期)\n条件不满足时填充 created_at+48h\n轮询检查超期后标记 inactive +``` + + +## 组件与接口 + +### 组件 1:触发器调度框架(services/trigger_scheduler.py) + +**职责**:统一管理 cron/interval/event 三种触发方式,驱动后台任务执行。 + +**设计决策**: +- 复用现有 `Scheduler` 的轮询模式(每 30 秒检查 `trigger_jobs` 表),但 `trigger_jobs` 是独立于 `scheduled_tasks` 的新表,专门服务于业务触发器 +- cron/interval 类型通过轮询 `next_run_at` 触发;event 类型通过 `fire_event()` 方法直接触发 +- 每个 job 对应一个 Python 可调用对象,通过 `job_type` 映射到具体的服务方法 + +```python +import logging +from datetime import datetime, timezone +from typing import Any, Callable + +from app.database import get_connection + +logger = logging.getLogger(__name__) + +# job_type → 执行函数的注册表 +_JOB_REGISTRY: dict[str, Callable] = {} + + +def register_job(job_type: str, handler: Callable) -> None: + """注册 job_type 对应的执行函数。""" + _JOB_REGISTRY[job_type] 
= handler + + +def fire_event(event_name: str, payload: dict[str, Any] | None = None) -> int: + """ + 触发事件驱动型任务。 + + 查找 trigger_condition='event' 且 trigger_config.event_name 匹配的 enabled job, + 立即执行对应的 handler。 + + 返回: 执行的 job 数量 + """ + conn = get_connection() + executed = 0 + try: + with conn.cursor() as cur: + cur.execute( + """ + SELECT id, job_type, job_name + FROM biz.trigger_jobs + WHERE status = 'enabled' + AND trigger_condition = 'event' + AND trigger_config->>'event_name' = %s + """, + (event_name,), + ) + rows = cur.fetchall() + + for job_id, job_type, job_name in rows: + handler = _JOB_REGISTRY.get(job_type) + if not handler: + logger.warning("未注册的 job_type: %s (job_name=%s)", job_type, job_name) + continue + try: + handler(payload=payload) + executed += 1 + # 更新 last_run_at + with conn.cursor() as cur: + cur.execute( + "UPDATE biz.trigger_jobs SET last_run_at = NOW() WHERE id = %s", + (job_id,), + ) + conn.commit() + except Exception: + logger.exception("触发器 %s 执行失败", job_name) + conn.rollback() + + finally: + conn.close() + return executed + + +def check_scheduled_jobs() -> int: + """ + 检查 cron/interval 类型的到期 job 并执行。 + + 由 Scheduler 后台循环调用。 + 返回: 执行的 job 数量 + """ + conn = get_connection() + executed = 0 + try: + with conn.cursor() as cur: + cur.execute( + """ + SELECT id, job_type, job_name, trigger_condition, trigger_config + FROM biz.trigger_jobs + WHERE status = 'enabled' + AND trigger_condition IN ('cron', 'interval') + AND (next_run_at IS NULL OR next_run_at <= NOW()) + ORDER BY next_run_at ASC NULLS FIRST + """, + ) + rows = cur.fetchall() + + for job_id, job_type, job_name, trigger_condition, trigger_config in rows: + handler = _JOB_REGISTRY.get(job_type) + if not handler: + logger.warning("未注册的 job_type: %s", job_type) + continue + try: + handler() + executed += 1 + # 计算 next_run_at 并更新 + next_run = _calculate_next_run(trigger_condition, trigger_config) + with conn.cursor() as cur: + cur.execute( + """ + UPDATE biz.trigger_jobs + SET 
last_run_at = NOW(), next_run_at = %s + WHERE id = %s + """, + (next_run, job_id), + ) + conn.commit() + except Exception: + logger.exception("触发器 %s 执行失败", job_name) + conn.rollback() + + finally: + conn.close() + return executed + + +def _calculate_next_run(trigger_condition: str, trigger_config: dict) -> datetime | None: + """根据触发条件和配置计算下次运行时间。""" + now = datetime.now(timezone.utc) + if trigger_condition == "interval": + seconds = trigger_config.get("interval_seconds", 3600) + from datetime import timedelta + return now + timedelta(seconds=seconds) + elif trigger_condition == "cron": + # 复用现有 scheduler._parse_simple_cron + from app.services.scheduler import _parse_simple_cron + return _parse_simple_cron(trigger_config.get("cron_expression", "0 4 * * *"), now) + return None # event 类型无 next_run_at +``` + +### 组件 2:任务生成器(services/task_generator.py) + +**职责**:每日 4:00 运行,基于 WBI/NCI/RS 指数为每个助教生成/更新任务。 + +**核心逻辑**(纯函数,可独立测试): + +```python +from decimal import Decimal +from dataclasses import dataclass +from enum import IntEnum + + +class TaskPriority(IntEnum): + """任务类型优先级,数值越小优先级越高。""" + HIGH_PRIORITY_RECALL = 0 + PRIORITY_RECALL = 0 + FOLLOW_UP_VISIT = 1 + RELATIONSHIP_BUILDING = 2 + + +TASK_TYPE_PRIORITY = { + "high_priority_recall": TaskPriority.HIGH_PRIORITY_RECALL, + "priority_recall": TaskPriority.PRIORITY_RECALL, + "follow_up_visit": TaskPriority.FOLLOW_UP_VISIT, + "relationship_building": TaskPriority.RELATIONSHIP_BUILDING, +} + + +@dataclass +class IndexData: + """某客户-助教对的指数数据。""" + site_id: int + assistant_id: int + member_id: int + wbi: Decimal # 流失回赢指数 + nci: Decimal # 新客转化指数 + rs: Decimal # 关系强度指数 + has_active_recall: bool # 是否有活跃召回任务 + has_follow_up_note: bool # 召回完成后是否有回访备注 + + +def determine_task_type(index_data: IndexData) -> str | None: + """ + 根据指数数据确定应生成的任务类型。 + + 优先级规则(高 → 低): + 1. max(WBI, NCI) > 7 → high_priority_recall + 2. max(WBI, NCI) > 5 → priority_recall + 3. 完成召回后无回访备注 → follow_up_visit + 4. RS < 6 → relationship_building + 5. 
不满足任何条件 → None(不生成任务)
+
+    返回: task_type 字符串或 None
+    """
+    priority_score = max(index_data.wbi, index_data.nci)
+
+    if priority_score > 7:
+        return "high_priority_recall"
+    if priority_score > 5:
+        return "priority_recall"
+    if not index_data.has_active_recall and not index_data.has_follow_up_note:
+        # 完成召回后无回访备注 → 回访任务
+        # 注意:调用方需保证仅对"已完成召回"的客户将 has_active_recall 置为 False,
+        # 否则从未产生过召回任务的客户也会落入此分支
+        return "follow_up_visit"
+    if index_data.rs < 6:
+        return "relationship_building"
+    return None
+
+
+def should_replace_task(
+    existing_type: str, new_type: str
+) -> bool:
+    """
+    判断新任务类型是否应替换现有任务类型。
+
+    规则:高优先级任务覆盖低优先级任务。
+    同优先级(如 high_priority_recall 和 priority_recall 都是 0)时,
+    类型不同则替换。
+    """
+    if existing_type == new_type:
+        return False
+    return True  # 类型不同即替换
+```
+
+**执行流程**:
+
+```python
+def run(self) -> dict:
+    """
+    任务生成器主流程。
+
+    1. 通过 auth.user_assistant_binding 获取所有已绑定助教
+    2. 对每个助教,通过 FDW 读取 WBI/NCI/RS 指数
+    3. 调用 determine_task_type() 确定任务类型
+    4. 检查是否已存在相同 (site_id, assistant_id, member_id, task_type) 的 active 任务
+       - 存在 → 跳过
+    5. 检查是否已存在相同 (site_id, assistant_id, member_id) 但不同 task_type 的 active 任务
+       - 存在 → 关闭旧任务 + 创建新任务 + 记录 history
+    6. 不存在 → 创建新任务
+    7. 更新 trigger_jobs 时间戳
+
+    返回: {"created": int, "replaced": int, "skipped": int}
+    """
+    ...
+```
+
+### 组件 3:任务管理服务(services/task_manager.py)
+
+**职责**:任务 CRUD、置顶、放弃、取消操作。
+
+```python
+async def get_task_list(
+    user_id: int, site_id: int
+) -> list[dict]:
+    """
+    获取助教的活跃任务列表。
+
+    1. 通过 auth.user_assistant_binding 获取 assistant_id
+    2. 查询 biz.coach_tasks WHERE site_id=? AND assistant_id=? AND status='active'
+    3. JOIN fdw_etl.v_dim_member 获取客户基本信息
+    4. JOIN fdw_etl.v_dws_member_assistant_relation_index 获取 RS 指数
+    5. 计算爱心 icon 档位:💖>8.5 / 🧡>7 / 💛>5 / 💙≤5
+    6. 排序:is_pinned DESC, priority_score DESC, created_at ASC
+
+    注意:FDW 查询需要 SET LOCAL app.current_site_id。
+    """
+    ...
+
+
+async def pin_task(task_id: int, user_id: int, site_id: int) -> dict:
+    """置顶任务。验证任务归属后设置 is_pinned=TRUE,记录 history。"""
+    ...
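+
+# --- 纯函数示意(非既定实现):get_task_list 第 5、6 步的档位与排序逻辑。
+# --- 假设 rs 缺失时归入最低档;边界取值与 Property 11 一致(>8.5 / >7 / >5 / ≤5)。
+def heart_icon(rs: float | None) -> str:
+    """RS 指数 → 爱心 icon 档位。"""
+    if rs is None:
+        return "💙"
+    if rs > 8.5:
+        return "💖"
+    if rs > 7:
+        return "🧡"
+    if rs > 5:
+        return "💛"
+    return "💙"
+
+
+def task_sort_key(task: dict) -> tuple:
+    """排序键:is_pinned DESC, priority_score DESC, created_at ASC。
+    用法:sorted(tasks, key=task_sort_key)"""
+    return (
+        not task.get("is_pinned", False),    # 置顶在前(False 排前)
+        -(task.get("priority_score") or 0),  # 高分在前
+        task["created_at"],                  # 先创建在前(ISO 字符串或 datetime 均可比较)
+    )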
+
+
+async def unpin_task(task_id: int, user_id: int, site_id: int) -> dict:
+    """取消置顶。验证任务归属后设置 is_pinned=FALSE。"""
+    ...
+
+
+async def abandon_task(
+    task_id: int, user_id: int, site_id: int, reason: str
+) -> dict:
+    """
+    放弃任务。
+
+    1. 验证 reason 非空(空则 422)
+    2. 验证任务归属和 status='active'
+    3. 设置 status='abandoned', abandon_reason=reason
+    4. 记录 coach_task_history
+    """
+    ...
+
+
+async def cancel_abandon(task_id: int, user_id: int, site_id: int) -> dict:
+    """
+    取消放弃。
+
+    1. 验证任务归属和 status='abandoned'
+    2. 恢复 status='active', 清空 abandon_reason
+    3. 记录 coach_task_history
+    """
+    ...
+
+
+def _record_history(
+    conn, task_id: int, action: str,
+    old_status: str | None = None, new_status: str | None = None,
+    old_task_type: str | None = None, new_task_type: str | None = None,
+    detail: dict | None = None,
+) -> None:
+    """在 coach_task_history 中记录变更。"""
+    ...
+```
+
+### 组件 4:有效期轮询器(services/task_expiry.py)
+
+**职责**:每小时检查 `expires_at` 不为 NULL 且已过期的任务,标记为 inactive。
+
+```python
+def run() -> dict:
+    """
+    有效期轮询主流程。
+
+    1. SELECT id FROM biz.coach_tasks
+       WHERE expires_at IS NOT NULL AND expires_at < NOW() AND status = 'active'
+    2. UPDATE status = 'inactive'
+    3. INSERT coach_task_history (action='expired')
+
+    返回: {"expired_count": int}
+    """
+    ...
+```
+
+### 组件 5:召回完成检测器(services/recall_detector.py)
+
+**职责**:ETL 数据更新后,检测助教服务记录并匹配活跃任务。
+
+```python
+def run(payload: dict | None = None) -> dict:
+    """
+    召回完成检测主流程。
+
+    1. 通过 FDW 读取 fdw_etl.v_dwd_assistant_service_log 中的新增服务记录
+       (基于 last_run_at 过滤增量)
+    2. 对每条服务记录,查找 biz.coach_tasks 中匹配的
+       (site_id, assistant_id, member_id) 且 status='active' 的任务
+    3. 将匹配任务标记为 completed:
+       - status = 'completed'
+       - completed_at = 服务时间
+       - completed_task_type = 当前 task_type(快照)
+    4. 记录 coach_task_history
+    5. 触发 fire_event('recall_completed', {site_id, assistant_id, member_id, service_time})
+
+    返回: {"completed_count": int}
+    """
+    ...
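+
+
+# --- 纯函数示意(非既定实现,数据形状为假设):第 2、3 步的匹配与类型快照逻辑
+def match_completed_tasks(service_logs: list[dict], active_tasks: list[dict]) -> list[dict]:
+    """按 (site_id, assistant_id, member_id) 匹配服务记录与 active 任务,
+    返回应标记 completed 的变更集(completed_task_type 取匹配时的类型快照)。"""
+    served = {
+        (log["site_id"], log["assistant_id"], log["member_id"]): log["service_time"]
+        for log in service_logs
+    }
+    updates = []
+    for task in active_tasks:
+        key = (task["site_id"], task["assistant_id"], task["member_id"])
+        if task["status"] == "active" and key in served:
+            updates.append({
+                "task_id": task["id"],
+                "status": "completed",
+                "completed_at": served[key],               # 服务时间
+                "completed_task_type": task["task_type"],  # 完成时类型快照(Property 6)
+            })
+    return updates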
+``` + +### 组件 6:备注回溯重分类器(services/note_reclassifier.py) + +**职责**:召回完成后,回溯检查是否有普通备注需重分类为回访备注。 + +```python +def run(payload: dict | None = None) -> dict: + """ + 备注回溯主流程。 + + payload 包含: {site_id, assistant_id, member_id, service_time} + + 1. 查找 biz.notes 中该 (site_id, target_type='member', target_id=member_id) + 在 service_time 之后提交的第一条 type='normal' 的备注 + 2. 将该备注 type 从 'normal' 更新为 'follow_up' + 3. 触发 AI 应用 6 接口(P5 实现,本 SPEC 仅定义触发接口): + - 调用 ai_analyze_note(note_id) → 返回 ai_score + 4. 若 ai_score >= 6: + - 生成 follow_up_visit 任务,status='completed'(回溯完成) + 5. 若 ai_score < 6: + - 生成 follow_up_visit 任务,status='active'(需助教重新备注) + + 返回: {"reclassified_count": int, "tasks_created": int} + """ + ... + + +def ai_analyze_note(note_id: int) -> int | None: + """ + AI 应用 6 备注分析接口(占位)。 + + P5 AI 集成层实现后替换此占位函数。 + 当前返回 None 表示 AI 未就绪,跳过评分逻辑。 + """ + return None +``` + +### 组件 7:备注服务(services/note_service.py) + +**职责**:备注 CRUD、星星评分存储与读取。 + +```python +async def create_note( + site_id: int, + user_id: int, + target_type: str, # 'member' + target_id: int, # member_id + content: str, + task_id: int | None = None, + rating_service_willingness: int | None = None, # 1-5 + rating_revisit_likelihood: int | None = None, # 1-5 +) -> dict: + """ + 创建备注。 + + 1. 验证评分范围(1-5 或 NULL),不合法则 422 + 2. 确定 note type: + - 若 task_id 关联的任务 task_type='follow_up_visit' → type='follow_up' + - 否则 → type='normal' + 3. INSERT INTO biz.notes + 4. 若 type='follow_up': + - 触发 AI 应用 6 分析(P5 实现) + - 若 ai_score >= 6 且关联任务 status='active' → 标记任务 completed + 5. 返回创建的备注记录 + + 注意:星星评分不参与回访完成判定,不参与 AI 分析,仅存储。 + """ + ... + + +async def get_notes( + site_id: int, target_type: str, target_id: int +) -> list[dict]: + """ + 查询某目标的备注列表。 + + 按 created_at DESC 排序,包含星星评分和 AI 评分。 + """ + ... + + +async def delete_note(note_id: int, user_id: int, site_id: int) -> dict: + """ + 删除备注。 + + 验证备注归属后执行硬删除。 + """ + ... 
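+
+
+# --- 纯函数示意(非既定实现):create_note 第 2 步的备注类型判定(Property 8)
+def determine_note_type(task_type: str | None) -> str:
+    """关联任务为 follow_up_visit → 'follow_up';其余(含未关联任务)→ 'normal'。"""
+    return "follow_up" if task_type == "follow_up_visit" else "normal"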
+``` + +### 组件 8:路由端点 + +#### 8.1 小程序任务路由(routers/xcx_tasks.py) + +| 方法 | 路径 | 说明 | 认证要求 | +|------|------|------|---------| +| GET | `/api/xcx/tasks` | 获取任务列表 | JWT(approved) | +| POST | `/api/xcx/tasks/{id}/pin` | 置顶任务 | JWT(approved) | +| POST | `/api/xcx/tasks/{id}/unpin` | 取消置顶 | JWT(approved) | +| POST | `/api/xcx/tasks/{id}/abandon` | 放弃任务 | JWT(approved) | +| POST | `/api/xcx/tasks/{id}/cancel-abandon` | 取消放弃 | JWT(approved) | + +#### 8.2 小程序备注路由(routers/xcx_notes.py) + +| 方法 | 路径 | 说明 | 认证要求 | +|------|------|------|---------| +| POST | `/api/xcx/notes` | 创建备注 | JWT(approved) | +| GET | `/api/xcx/notes` | 查询备注列表(query: target_type, target_id) | JWT(approved) | +| DELETE | `/api/xcx/notes/{id}` | 删除备注 | JWT(approved) | + +### 组件 9:Pydantic 模型 + +```python +# schemas/xcx_tasks.py +from pydantic import BaseModel, Field + +class TaskListItem(BaseModel): + id: int + task_type: str + status: str + priority_score: float | None + is_pinned: bool + expires_at: str | None + created_at: str + # 客户信息(FDW 读取) + member_id: int + member_name: str | None + member_phone: str | None + # RS 指数 + 爱心 icon + rs_score: float | None + heart_icon: str # 💖 / 🧡 / 💛 / 💙 + +class AbandonRequest(BaseModel): + reason: str = Field(..., min_length=1, description="放弃原因(必填)") + + +# schemas/xcx_notes.py +from pydantic import BaseModel, Field + +class NoteCreateRequest(BaseModel): + target_type: str = Field(default="member") + target_id: int + content: str = Field(..., min_length=1) + task_id: int | None = None + rating_service_willingness: int | None = Field(None, ge=1, le=5) + rating_revisit_likelihood: int | None = Field(None, ge=1, le=5) + +class NoteOut(BaseModel): + id: int + type: str + content: str + rating_service_willingness: int | None + rating_revisit_likelihood: int | None + ai_score: int | None + ai_analysis: str | None + task_id: int | None + created_at: str + updated_at: str +``` + + +## 数据模型 + +### ER 图 + +```mermaid +erDiagram + coach_tasks { + bigserial id PK + bigint 
site_id "NOT NULL" + bigint assistant_id "NOT NULL" + bigint member_id "NOT NULL" + varchar task_type "NOT NULL (4种)" + varchar status "NOT NULL DEFAULT 'active'" + numeric_5_2 priority_score "max(WBI,NCI) 快照" + timestamptz expires_at "可空,有效期" + boolean is_pinned "DEFAULT FALSE" + text abandon_reason "可空" + timestamptz completed_at "可空" + varchar completed_task_type "可空,完成时类型快照" + bigint parent_task_id "可空,FK → coach_tasks" + timestamptz created_at "DEFAULT NOW()" + timestamptz updated_at "DEFAULT NOW()" + } + + coach_task_history { + bigserial id PK + bigint task_id "FK → coach_tasks" + varchar action "NOT NULL" + varchar old_status "可空" + varchar new_status "可空" + varchar old_task_type "可空" + varchar new_task_type "可空" + jsonb detail "可空" + timestamptz created_at "DEFAULT NOW()" + } + + notes { + bigserial id PK + bigint site_id "NOT NULL" + integer user_id "NOT NULL" + varchar target_type "NOT NULL" + bigint target_id "NOT NULL" + varchar type "NOT NULL DEFAULT 'normal'" + text content "NOT NULL" + smallint rating_service_willingness "可空 CHECK 1-5" + smallint rating_revisit_likelihood "可空 CHECK 1-5" + bigint task_id "可空 FK → coach_tasks" + smallint ai_score "可空" + text ai_analysis "可空" + timestamptz created_at "DEFAULT NOW()" + timestamptz updated_at "DEFAULT NOW()" + } + + trigger_jobs { + serial id PK + varchar job_type "NOT NULL" + varchar job_name "NOT NULL UNIQUE" + varchar trigger_condition "NOT NULL (cron/interval/event)" + jsonb trigger_config "NOT NULL" + timestamptz last_run_at "可空" + timestamptz next_run_at "可空" + varchar status "NOT NULL DEFAULT 'enabled'" + timestamptz created_at "DEFAULT NOW()" + } + + coach_tasks ||--o{ coach_task_history : "变更记录" + coach_tasks ||--o{ notes : "关联备注" + coach_tasks ||--o| coach_tasks : "parent_task_id" +``` + +### 表 DDL + +所有表在 `biz` Schema 下,迁移脚本位于 `db/zqyy_app/migrations/`。 + +#### biz.coach_tasks + +```sql +CREATE TABLE IF NOT EXISTS biz.coach_tasks ( + id BIGSERIAL PRIMARY KEY, + site_id BIGINT NOT NULL, + 
assistant_id BIGINT NOT NULL, + member_id BIGINT NOT NULL, + task_type VARCHAR(50) NOT NULL, + status VARCHAR(20) NOT NULL DEFAULT 'active', + priority_score NUMERIC(5,2), + expires_at TIMESTAMPTZ, + is_pinned BOOLEAN DEFAULT FALSE, + abandon_reason TEXT, + completed_at TIMESTAMPTZ, + completed_task_type VARCHAR(50), + parent_task_id BIGINT REFERENCES biz.coach_tasks(id), + created_at TIMESTAMPTZ DEFAULT NOW(), + updated_at TIMESTAMPTZ DEFAULT NOW() +); + +-- 部分唯一索引:同一 (site_id, assistant_id, member_id, task_type) 下 active 任务最多一条 +CREATE UNIQUE INDEX IF NOT EXISTS idx_coach_tasks_site_assistant_member_type + ON biz.coach_tasks (site_id, assistant_id, member_id, task_type) + WHERE status = 'active'; + +-- 助教任务列表查询索引 +CREATE INDEX IF NOT EXISTS idx_coach_tasks_assistant_status + ON biz.coach_tasks (site_id, assistant_id, status); +``` + +#### biz.coach_task_history + +```sql +CREATE TABLE IF NOT EXISTS biz.coach_task_history ( + id BIGSERIAL PRIMARY KEY, + task_id BIGINT NOT NULL REFERENCES biz.coach_tasks(id), + action VARCHAR(50) NOT NULL, + old_status VARCHAR(20), + new_status VARCHAR(20), + old_task_type VARCHAR(50), + new_task_type VARCHAR(50), + detail JSONB, + created_at TIMESTAMPTZ DEFAULT NOW() +); +``` + +#### biz.notes + +```sql +CREATE TABLE IF NOT EXISTS biz.notes ( + id BIGSERIAL PRIMARY KEY, + site_id BIGINT NOT NULL, + user_id INTEGER NOT NULL, + target_type VARCHAR(50) NOT NULL, + target_id BIGINT NOT NULL, + type VARCHAR(20) NOT NULL DEFAULT 'normal', + content TEXT NOT NULL, + rating_service_willingness SMALLINT CHECK (rating_service_willingness BETWEEN 1 AND 5), + rating_revisit_likelihood SMALLINT CHECK (rating_revisit_likelihood BETWEEN 1 AND 5), + task_id BIGINT REFERENCES biz.coach_tasks(id), + ai_score SMALLINT, + ai_analysis TEXT, + created_at TIMESTAMPTZ DEFAULT NOW(), + updated_at TIMESTAMPTZ DEFAULT NOW() +); + +-- 按目标查询备注索引 +CREATE INDEX IF NOT EXISTS idx_notes_target + ON biz.notes (site_id, target_type, target_id); +``` + +#### 
biz.trigger_jobs + +```sql +CREATE TABLE IF NOT EXISTS biz.trigger_jobs ( + id SERIAL PRIMARY KEY, + job_type VARCHAR(100) NOT NULL, + job_name VARCHAR(100) NOT NULL UNIQUE, + trigger_condition VARCHAR(20) NOT NULL, + trigger_config JSONB NOT NULL, + last_run_at TIMESTAMPTZ, + next_run_at TIMESTAMPTZ, + status VARCHAR(20) NOT NULL DEFAULT 'enabled', + created_at TIMESTAMPTZ DEFAULT NOW() +); +``` + +### 种子数据 + +```sql +-- 预置触发器配置 +INSERT INTO biz.trigger_jobs (job_type, job_name, trigger_condition, trigger_config, next_run_at) +VALUES + ('task_generator', 'task_generator', 'cron', + '{"cron_expression": "0 4 * * *"}', + (CURRENT_DATE + 1) + INTERVAL '4 hours'), + + ('task_expiry_check', 'task_expiry_check', 'interval', + '{"interval_seconds": 3600}', + NOW() + INTERVAL '1 hour'), + + ('recall_completion_check', 'recall_completion_check', 'event', + '{"event_name": "etl_data_updated"}', + NULL), + + ('note_reclassify_backfill', 'note_reclassify_backfill', 'event', + '{"event_name": "recall_completed"}', + NULL) +ON CONFLICT (job_name) DO NOTHING; +``` + +### 迁移脚本清单 + +| 序号 | 文件名 | 内容 | +|------|--------|------| +| 1 | `YYYY-MM-DD__p4_create_biz_tables.sql` | 创建 coach_tasks + coach_task_history + notes + trigger_jobs 表及索引 | +| 2 | `YYYY-MM-DD__p4_seed_trigger_jobs.sql` | 预置 4 条触发器配置种子数据 | + +### FDW 数据依赖 + +任务生成器和召回检测器通过 `fdw_etl` Schema 读取以下外部表(P1 已建立映射): + +| 外部表 | 用途 | +|--------|------| +| `fdw_etl.v_dws_member_winback_index` | WBI 流失回赢指数 | +| `fdw_etl.v_dws_member_newconv_index` | NCI 新客转化指数 | +| `fdw_etl.v_dws_member_assistant_relation_index` | RS 关系强度指数 | +| `fdw_etl.v_dwd_assistant_service_log` | 助教服务记录(召回检测) | +| `fdw_etl.v_dim_member` | 客户基本信息(任务列表展示) | + +**FDW 查询模式**:所有 FDW 查询通过业务库连接(`get_connection()`),在事务中 `SET LOCAL app.current_site_id = %s` 设置 RLS 隔离后查询 `fdw_etl.*` 外部表。 + + +## 正确性属性(Correctness Properties) + +*属性是系统在所有有效执行中都应保持为真的特征或行为——本质上是关于系统应该做什么的形式化陈述。属性是人类可读规格与机器可验证正确性保证之间的桥梁。* + +### Property 1:任务类型确定正确性 + +*For any* 指数数据组合(WBI ≥ 0, NCI ≥ 0, 
RS ≥ 0),`determine_task_type()` 应满足: +- 当 `max(WBI, NCI) > 7` 时返回 `high_priority_recall` +- 当 `5 < max(WBI, NCI) ≤ 7` 时返回 `priority_recall` +- 当 `RS < 6` 且不满足上述条件时返回 `relationship_building` +- 且 `priority_score` 始终等于 `max(WBI, NCI)` + +**Validates: Requirements 3.1, 3.2, 3.3, 3.5** + +### Property 2:活跃任务唯一性不变量 + +*For any* `(site_id, assistant_id, member_id, task_type)` 组合,`biz.coach_tasks` 中 `status = 'active'` 的记录最多只有一条。任何创建或状态变更操作都不应违反此约束。 + +**Validates: Requirements 1.5, 3.6, 14.1** + +### Property 3:任务类型变更状态机 + +*For any* 已存在 `(site_id, assistant_id, member_id)` 的 active 任务,当新任务类型与现有类型不同时,执行类型变更后:旧任务 `status` 变为 `inactive`,新任务 `status` 为 `active`,且 `coach_task_history` 中存在对应的变更记录(包含 `old_task_type` 和 `new_task_type`)。 + +**Validates: Requirements 3.7, 5.1, 5.4, 14.2** + +### Property 4:48 小时滞留机制 + +*For any* `follow_up_visit` 类型任务: +- 生成时 `expires_at` 为 NULL,`status` 为 `active` +- 当触发条件不再满足时,`expires_at` 被填充为 `created_at + 48 小时` +- 当 `expires_at` 不为 NULL 且当前时间超过 `expires_at` 时,轮询后 `status` 变为 `inactive` +- 当新 `follow_up_visit` 任务顶替有 `expires_at` 的旧任务时,旧任务变为 `inactive`,新任务 `expires_at` 为 NULL + +**Validates: Requirements 4.1, 4.2, 4.3, 4.4, 14.3** + +### Property 5:放弃与取消放弃往返 + +*For any* `status = 'active'` 的任务和非空的放弃原因字符串: +- 放弃操作后 `status = 'abandoned'` 且 `abandon_reason` 等于提供的原因 +- 取消放弃后 `status = 'active'` 且 `abandon_reason` 为空 +- 放弃时若 `abandon_reason` 为空字符串或纯空白,应返回 422 错误且任务状态不变 + +**Validates: Requirements 8.4, 8.6, 8.7, 14.4** + +### Property 6:召回完成检测与类型快照 + +*For any* `(site_id, assistant_id, member_id)` 组合,当检测到助教为该客户提供了服务时,所有匹配的 `status = 'active'` 任务应变为 `completed`,且 `completed_at` 记录服务时间,`completed_task_type` 记录完成时的 `task_type` 值(快照不变量)。 + +**Validates: Requirements 6.2, 6.3, 14.6** + +### Property 7:备注回溯重分类 + +*For any* 召回完成事件(包含 site_id, assistant_id, member_id, service_time),若在 service_time 之后存在该客户的 `type = 'normal'` 备注,回溯操作后该备注的 `type` 应变为 `follow_up`。若不存在符合条件的备注,则不做任何修改。 + +**Validates: Requirements 7.1, 7.2, 14.7** + +### Property 8:备注类型自动设置 + 
+*For any* 备注创建操作: +- 若关联的 `task_id` 对应任务的 `task_type = 'follow_up_visit'`,则备注 `type` 应为 `follow_up` +- 若关联的 `task_id` 对应任务的 `task_type` 不是 `follow_up_visit`,则备注 `type` 应为 `normal` +- 若未关联 `task_id`,则备注 `type` 应为 `normal` + +**Validates: Requirements 9.2, 9.3** + +### Property 9:星星评分范围约束 + +*For any* 备注创建操作,`rating_service_willingness` 和 `rating_revisit_likelihood` 的值应为 NULL 或在 1-5 范围内。范围外的值应被拒绝(422 错误),且备注不被创建。 + +**Validates: Requirements 9.8, 14.5** + +### Property 10:任务列表排序正确性 + +*For any* 助教的活跃任务列表,返回结果应按 `is_pinned DESC, priority_score DESC, created_at ASC` 排序。即:置顶任务在前,同置顶状态下高优先级在前,同优先级下先创建的在前。 + +**Validates: Requirements 8.1** + +### Property 11:爱心 icon 档位计算 + +*For any* RS 指数值,爱心 icon 应满足: +- RS > 8.5 → 💖 +- 7 < RS ≤ 8.5 → 🧡 +- 5 < RS ≤ 7 → 💛 +- RS ≤ 5 → 💙 + +**Validates: Requirements 8.2** + +### Property 12:触发器 next_run_at 计算 + +*For any* cron 类型触发器配置和当前时间,计算的 `next_run_at` 应大于当前时间。*For any* interval 类型触发器配置(interval_seconds > 0),计算的 `next_run_at` 应等于当前时间加上 interval_seconds 秒。 + +**Validates: Requirements 10.1, 10.2** + +### Property 13:迁移脚本幂等性 + +*For any* 本次新增的迁移脚本(DDL + 种子数据),连续执行两次的结果应与执行一次相同——第二次执行不应产生错误,且数据库状态不变。 + +**Validates: Requirements 1.8, 2.5, 11.4, 11.5** + +### Property 14:AI 评分驱动的任务完成判定 + +*For any* `type = 'follow_up'` 的备注和关联的 `follow_up_visit` 任务: +- 当 AI 应用 6 返回评分 ≥ 6 且任务 `status = 'active'` 时,任务应标记为 `completed` +- 当 AI 应用 6 返回评分 < 6 时,任务 `status` 保持 `active` + +**Validates: Requirements 7.4, 7.5, 9.5** + +### Property 15:状态变更历史完整性 + +*For any* 任务状态变更操作(类型变更、放弃、取消放弃、完成、过期),`coach_task_history` 中应存在对应记录,包含正确的 `action`、`old_status`、`new_status` 字段。 + +**Validates: Requirements 5.4, 8.3** + + +## 错误处理 + +### API 错误码规范 + +| HTTP 状态码 | 场景 | 响应体 | +|------------|------|--------| +| 401 | JWT 无效/过期 | `{"detail": "无效的令牌"}` | +| 403 | 用户未 approved、操作非自己的任务 | `{"detail": "权限不足"}` | +| 404 | 任务/备注不存在 | `{"detail": "资源不存在"}` | +| 409 | 任务状态不允许操作(如对 inactive 任务置顶) | `{"detail": "任务状态不允许此操作"}` | +| 422 | 放弃原因为空、星星评分超范围、请求体校验失败 | Pydantic 标准错误格式 | 
+| 500 | 数据库连接失败、FDW 查询异常 | `{"detail": "服务器内部错误"}` | + +### 后台任务错误处理 + +| 场景 | 处理方式 | +|------|---------| +| FDW 查询失败(ETL 库不可用) | 记录错误日志,跳过本次执行,不影响其他触发器 | +| 单个任务生成失败 | 捕获异常,记录日志,继续处理下一个助教-客户对 | +| 触发器执行异常 | 记录错误日志,不中断其他触发器执行(需求 11.7) | +| 数据库写入冲突(唯一索引) | 捕获 `UniqueViolation`,视为"已存在"跳过 | +| AI 应用 6 接口不可用 | 跳过评分逻辑,备注正常创建,ai_score 保持 NULL | +| expires_at 计算溢出 | 防御性检查,确保 expires_at > created_at | + +### 数据库事务策略 + +| 操作 | 事务范围 | +|------|---------| +| 任务生成器(批量) | 每个助教-客户对独立事务,失败不影响其他 | +| 任务状态变更 | 单任务事务:UPDATE task + INSERT history 在同一事务 | +| 备注创建 + 任务完成 | 同一事务:INSERT note + UPDATE task(如触发完成) | +| 触发器调度 | 每个 job 独立事务 | +| FDW 查询 | 只读事务,`SET LOCAL app.current_site_id` 设置 RLS | + +### 环境变量缺失处理 + +| 变量 | 缺失时行为 | +|------|-----------| +| `DB_HOST` 等数据库参数 | 数据库连接失败,返回 500 | +| `ETL_DB_HOST` 等 ETL 参数 | FDW 查询失败,任务生成器/召回检测器跳过,记录日志 | +| `JWT_SECRET_KEY` | JWT 签发/验证使用空密钥(仅开发环境) | + +## 测试策略 + +### 属性测试(Property-Based Testing) + +使用 Python `hypothesis` 框架,测试目录:`tests/`(Monorepo 级属性测试目录)。 + +每个属性测试至少运行 100 次迭代。每个测试用注释标注对应的设计属性编号。 + +标注格式:`# Feature: 04-miniapp-core-business, Property N: <属性标题>` + +**核心纯函数(可直接属性测试,不依赖数据库)**: + +| 属性 | 测试文件 | 测试方法 | 生成器 | +|------|---------|---------|--------| +| P1 任务类型确定 | `tests/test_core_business_properties.py` | 生成随机 WBI/NCI/RS 值,验证 `determine_task_type()` 返回值 | `hypothesis.strategies.decimals(min_value=0, max_value=10, places=2)` | +| P9 评分范围约束 | `tests/test_core_business_properties.py` | 生成随机整数,验证 Pydantic 模型校验 | `hypothesis.strategies.integers(min_value=-100, max_value=100)` | +| P10 任务列表排序 | `tests/test_core_business_properties.py` | 生成随机任务列表(不同 is_pinned/priority_score/created_at),验证排序 | 自定义 strategy 生成任务列表 | +| P11 爱心 icon 档位 | `tests/test_core_business_properties.py` | 生成随机 RS 值,验证 icon 映射 | `hypothesis.strategies.decimals(min_value=0, max_value=10, places=1)` | +| P12 next_run_at 计算 | `tests/test_core_business_properties.py` | 生成随机 cron/interval 配置和当前时间,验证计算结果 | 自定义 strategy 生成触发器配置 | + +**状态机属性(需要 FakeDB 模拟数据库状态)**: + +| 属性 | 测试文件 | 
测试方法 | 生成器 | +|------|---------|---------|--------| +| P2 唯一性不变量 | `tests/test_core_business_properties.py` | 生成随机 (site_id, assistant_id, member_id, task_type) 组合,模拟插入,验证唯一性 | 自定义 strategy 生成任务组合 | +| P3 类型变更状态机 | `tests/test_core_business_properties.py` | 生成随机现有任务+新任务类型,执行变更,验证旧任务 inactive + 新任务 active + history | 自定义 strategy | +| P4 48h 滞留机制 | `tests/test_core_business_properties.py` | 生成随机 follow_up_visit 任务+时间偏移,验证 expires_at 填充和过期逻辑 | `hypothesis.strategies.datetimes` + `timedeltas` | +| P5 放弃往返 | `tests/test_core_business_properties.py` | 生成随机任务+放弃原因,执行放弃→取消放弃,验证状态恢复 | `hypothesis.strategies.text(min_size=1)` | +| P6 召回完成+快照 | `tests/test_core_business_properties.py` | 生成随机 active 任务+服务记录,执行完成检测,验证 completed_task_type | 自定义 strategy | +| P7 备注回溯 | `tests/test_core_business_properties.py` | 生成随机备注列表+service_time,执行回溯,验证 type 变更 | 自定义 strategy | +| P8 备注类型自动设置 | `tests/test_core_business_properties.py` | 生成随机 task_type + 备注创建,验证 note.type | `hypothesis.strategies.sampled_from(task_types)` | +| P14 AI 评分判定 | `tests/test_core_business_properties.py` | 生成随机 ai_score + 任务状态,验证完成判定 | `hypothesis.strategies.integers(min_value=0, max_value=10)` | +| P15 历史完整性 | `tests/test_core_business_properties.py` | 生成随机状态变更操作序列,验证 history 记录数量和内容 | 自定义 strategy | + +**注意**:P13(迁移幂等性)作为集成测试在测试库中执行,不使用 hypothesis。 + +### 单元测试 + +单元测试位于 `apps/backend/tests/`,聚焦于: + +- `test_task_generator.py`:任务生成器的边界情况(无指数数据、全部跳过、全部替换) +- `test_task_manager.py`:任务 CRUD 的边界情况(操作非自己的任务、状态不允许的操作) +- `test_note_service.py`:备注服务的边界情况(无关联任务、AI 接口不可用) +- `test_task_expiry.py`:有效期轮询的边界情况(无过期任务、批量过期) +- `test_recall_detector.py`:召回检测的边界情况(无匹配任务、多任务匹配) +- `test_note_reclassifier.py`:备注回溯的边界情况(无符合条件备注、多条备注取第一条) +- `test_trigger_scheduler.py`:触发器调度的边界情况(disabled job、未注册 handler) + +### 集成测试 + +集成测试通过以下方式验证: + +1. **迁移脚本幂等性**:在 `test_zqyy_app` 中连续执行两次迁移脚本,验证无错误 +2. **种子数据完整性**:验证 4 条触发器配置正确插入 +3. 
**端到端流程**:任务生成 → 任务列表 → 置顶/放弃 → 备注创建 → 召回完成 → 备注回溯 + +### 测试配置 + +- 属性测试:`cd C:\NeoZQYY && pytest tests/test_core_business_properties.py -v` +- 后端单元测试:`cd apps/backend && pytest tests/ -v` +- 每个属性测试标注 `@settings(max_examples=200)` +- 每个属性测试注释引用设计文档 Property 编号 +- 属性测试库:`hypothesis`(已在项目依赖中) +- 数据库测试使用 `test_zqyy_app`(禁止连正式库) diff --git a/.kiro/specs/04-miniapp-core-business/requirements.md b/.kiro/specs/04-miniapp-core-business/requirements.md new file mode 100644 index 0000000..285a913 --- /dev/null +++ b/.kiro/specs/04-miniapp-core-business/requirements.md @@ -0,0 +1,278 @@ +# 需求文档:小程序核心业务模块(miniapp-core-business) + +## 简介 + +本 SPEC 实现小程序的核心业务逻辑,涵盖助教任务系统(生成、分配、状态流转、完成检测)、备注系统(CRUD、星星评分、类型区分)、以及后台触发器/轮询调度框架。系统基于 P1(miniapp-db-foundation)的数据库基础设施、P2(etl-dws-miniapp-extensions)的 DWS 指数数据、P3(miniapp-auth-system)的用户认证体系,在 `test_zqyy_app.biz` Schema 中创建任务、备注、触发器等业务表,并在 FastAPI 后端实现对应的 API 端点和后台调度逻辑。 + +## 术语表 + +- **Task_Generator**:任务生成器,每日 4:00 后运行,基于 WBI/NCI/RS 指数为每个助教分配 4 种类型任务的后台服务 +- **Task_Manager**:任务管理服务,负责任务 CRUD、置顶、放弃、状态流转的后端模块 +- **Task_Expiry_Checker**:任务有效期轮询器,每小时检查 `expires_at` 并将过期任务标记为无效 +- **Recall_Completion_Detector**:召回完成检测器,ETL 数据更新后检查助教是否为匹配客户提供了服务 +- **Note_Reclassifier**:备注回溯重分类器,召回完成时回溯检查是否有普通备注需重分类为回访备注 +- **Note_Service**:备注服务模块,负责备注 CRUD、星星评分存储与读取 +- **Trigger_Scheduler**:触发器调度框架,支持 cron/interval/event 三种触发方式的统一调度引擎 +- **coach_tasks**:助教任务表,位于 `biz` Schema,存储任务分配、状态、有效期等信息 +- **coach_task_history**:任务变更历史表,记录任务关闭/新建的追溯链 +- **notes**:统一备注表,位于 `biz` Schema,通过 `type` 字段区分普通备注/回访备注/放弃原因 +- **trigger_jobs**:触发器配置表,位于 `biz` Schema,存储轮询/事件触发器的配置与执行状态 +- **task_type**:任务类型枚举,取值为 `high_priority_recall`(高优先召回)/ `priority_recall`(优先召回)/ `follow_up_visit`(客户回访)/ `relationship_building`(关系构建) +- **task_status**:任务状态枚举,取值为 `active`(有效)/ `inactive`(无效)/ `completed`(已完成)/ `abandoned`(已放弃) +- **note_type**:备注类型枚举,取值为 `normal`(普通备注)/ `follow_up`(回访备注)/ `abandon_reason`(放弃原因) +- **priority_score**:优先级分数,取 `max(WBI, NCI)` 的快照值,用于任务排序 +- 
**expires_at**:有效期时间戳,默认 NULL(无限期),填充后表示任务将在该时间点过期 +- **FDW**:`postgres_fdw` 外部数据包装器,通过 `fdw_etl` Schema 读取 ETL 库指数数据 +- **Migration_Script**:存放在 `db/zqyy_app/migrations/` 中的纯 SQL 迁移脚本,以日期前缀命名 +- **site_id**:门店标识符,类型为 `BIGINT`,用于多门店数据隔离 +- **member_retention_clue**:维客线索表,位于 `public` Schema,存储助教为客户记录的维护线索(大类 + 摘要 + 详情),独立于 ETL 数据。当前已有基础表结构和 CRUD API(`/api/retention-clue`),若不足以支撑本 SPEC 的任务系统需求,可对其 DDL、Pydantic 模型及路由进行扩展或修改 + +## 需求 + +### 需求 1:业务数据表创建 + +**用户故事:** 作为后端开发者,我需要在 `biz` Schema 中创建任务、备注、触发器等业务表,以便支撑核心业务功能。 + +#### 验收标准 + +1. WHEN Migration_Script 执行完成, THE Task_Manager SHALL 在 `biz` Schema 中创建 `coach_tasks` 表,包含 `id`(BIGSERIAL PK)、`site_id`(BIGINT NOT NULL)、`assistant_id`(BIGINT NOT NULL)、`member_id`(BIGINT NOT NULL)、`task_type`(VARCHAR NOT NULL)、`status`(VARCHAR NOT NULL DEFAULT 'active')、`priority_score`(NUMERIC(5,2))、`expires_at`(TIMESTAMPTZ,可空)、`is_pinned`(BOOLEAN DEFAULT FALSE)、`abandon_reason`(TEXT,可空)、`completed_at`(TIMESTAMPTZ,可空)、`completed_task_type`(VARCHAR,可空)、`parent_task_id`(BIGINT,可空,FK → coach_tasks)、`created_at`(TIMESTAMPTZ DEFAULT NOW())、`updated_at`(TIMESTAMPTZ DEFAULT NOW())字段 +2. WHEN Migration_Script 执行完成, THE Task_Manager SHALL 在 `biz` Schema 中创建 `coach_task_history` 表,包含 `id`(BIGSERIAL PK)、`task_id`(BIGINT FK → coach_tasks)、`action`(VARCHAR NOT NULL)、`old_status`(VARCHAR)、`new_status`(VARCHAR)、`old_task_type`(VARCHAR)、`new_task_type`(VARCHAR)、`detail`(JSONB)、`created_at`(TIMESTAMPTZ DEFAULT NOW())字段 +3. 
WHEN Migration_Script 执行完成, THE Note_Service SHALL 在 `biz` Schema 中创建 `notes` 表,包含 `id`(BIGSERIAL PK)、`site_id`(BIGINT NOT NULL)、`user_id`(INTEGER NOT NULL)、`target_type`(VARCHAR NOT NULL)、`target_id`(BIGINT NOT NULL)、`type`(VARCHAR NOT NULL DEFAULT 'normal')、`content`(TEXT NOT NULL)、`rating_service_willingness`(SMALLINT,可空,CHECK 1-5)、`rating_revisit_likelihood`(SMALLINT,可空,CHECK 1-5)、`task_id`(BIGINT,可空,FK → coach_tasks)、`ai_score`(SMALLINT,可空)、`ai_analysis`(TEXT,可空)、`created_at`(TIMESTAMPTZ DEFAULT NOW())、`updated_at`(TIMESTAMPTZ DEFAULT NOW())字段 +4. WHEN Migration_Script 执行完成, THE Trigger_Scheduler SHALL 在 `biz` Schema 中创建 `trigger_jobs` 表,包含 `id`(SERIAL PK)、`job_type`(VARCHAR NOT NULL)、`job_name`(VARCHAR NOT NULL UNIQUE)、`trigger_condition`(VARCHAR NOT NULL)、`trigger_config`(JSONB NOT NULL)、`last_run_at`(TIMESTAMPTZ,可空)、`next_run_at`(TIMESTAMPTZ,可空)、`status`(VARCHAR NOT NULL DEFAULT 'enabled')、`created_at`(TIMESTAMPTZ DEFAULT NOW())字段 +5. THE Migration_Script SHALL 对 `coach_tasks` 表创建唯一索引 `idx_coach_tasks_site_assistant_member_type` 在 `(site_id, assistant_id, member_id, task_type)` 上,仅对 `status = 'active'` 的记录生效(部分唯一索引) +6. THE Migration_Script SHALL 对 `coach_tasks` 表创建索引 `idx_coach_tasks_assistant_status` 在 `(site_id, assistant_id, status)` 上,用于助教任务列表查询 +7. THE Migration_Script SHALL 对 `notes` 表创建索引 `idx_notes_target` 在 `(site_id, target_type, target_id)` 上,用于按目标查询备注 +8. THE Migration_Script SHALL 使用 `IF NOT EXISTS` 幂等语法,确保重复执行不会报错 +9. THE Migration_Script SHALL 在脚本中包含回滚语句(以注释形式) + +### 需求 2:触发器种子数据预置 + +**用户故事:** 作为系统管理员,我需要系统预置核心触发器配置,以便后台调度任务自动运行。 + +#### 验收标准 + +1. WHEN 种子数据脚本执行完成, THE Trigger_Scheduler SHALL 在 `biz.trigger_jobs` 表中插入 `task_generator` 记录(trigger_condition='cron',trigger_config 包含 cron 表达式 '0 4 * * *') +2. WHEN 种子数据脚本执行完成, THE Trigger_Scheduler SHALL 在 `biz.trigger_jobs` 表中插入 `task_expiry_check` 记录(trigger_condition='interval',trigger_config 包含间隔秒数 3600) +3. 
WHEN 种子数据脚本执行完成, THE Trigger_Scheduler SHALL 在 `biz.trigger_jobs` 表中插入 `recall_completion_check` 记录(trigger_condition='event',trigger_config 包含事件名 'etl_data_updated') +4. WHEN 种子数据脚本执行完成, THE Trigger_Scheduler SHALL 在 `biz.trigger_jobs` 表中插入 `note_reclassify_backfill` 记录(trigger_condition='event',trigger_config 包含事件名 'recall_completed') +5. THE 种子数据脚本 SHALL 使用 `ON CONFLICT (job_name) DO NOTHING` 语法,确保重复执行不会产生重复数据 + +### 需求 3:任务生成器 + +**用户故事:** 作为助教,我每天打开小程序能看到系统为我分配的任务列表,按优先级排序。 + +#### 验收标准 + +1. WHEN Task_Generator 运行时, THE Task_Generator SHALL 通过 FDW 读取 `fdw_etl` 中的 `dws_member_winback_index`(WBI)和 `dws_member_newconv_index`(NCI)指数数据,计算 `priority_score = max(WBI, NCI)` +2. WHEN `priority_score > 7`, THE Task_Generator SHALL 为该客户-助教对生成 `high_priority_recall`(高优先召回)类型任务 +3. WHEN `priority_score > 5` 且 `priority_score <= 7`, THE Task_Generator SHALL 为该客户-助教对生成 `priority_recall`(优先召回)类型任务 +4. WHEN 助教完成某客户的召回任务后该客户无回访备注, THE Task_Generator SHALL 为该客户-助教对生成 `follow_up_visit`(客户回访)类型任务 +5. WHEN 客户-助教对的 RS 指数 < 6(通过 FDW 读取 `dws_member_assistant_relation_index`), THE Task_Generator SHALL 为该客户-助教对生成 `relationship_building`(关系构建)类型任务 +6. WHEN Task_Generator 生成任务时发现已存在相同 `(site_id, assistant_id, member_id, task_type)` 且 `status = 'active'` 的任务, THE Task_Generator SHALL 跳过该任务不做任何操作 +7. WHEN Task_Generator 生成任务时发现已存在相同 `(site_id, assistant_id, member_id)` 但 `task_type` 不同且 `status = 'active'` 的任务, THE Task_Generator SHALL 将旧任务状态设为 `inactive`,创建新任务,并在 `coach_task_history` 中记录变更 +8. THE Task_Generator SHALL 按优先级从高到低的顺序处理任务类型:`high_priority_recall`(0)> `priority_recall`(0)> `follow_up_visit`(1)> `relationship_building`(2),高优先级任务覆盖低优先级任务 +9. THE Task_Generator SHALL 通过 `auth.user_assistant_binding` 确定助教与小程序用户的映射关系,仅为已绑定的助教生成任务 +10. THE Task_Generator SHALL 在 `trigger_jobs` 中更新 `last_run_at` 和 `next_run_at` 时间戳 + +### 需求 4:48 小时回访滞留机制 + +**用户故事:** 作为系统,回访任务至少保留 48 小时,到期后自动失效。 + +#### 验收标准 + +1. 
WHEN Task_Generator 生成 `follow_up_visit` 类型任务时, THE Task_Generator SHALL 将 `expires_at` 设为 NULL(无限期有效),`status` 设为 `active`
+2. WHEN Task_Generator 检测到某 `follow_up_visit` 任务的触发条件不再满足(指数变化), THE Task_Generator SHALL 将该任务的 `expires_at` 填充为 `created_at + 48 小时`,`status` 保持 `active`
+3. WHEN Task_Expiry_Checker 轮询检查时发现某任务的 `expires_at` 不为 NULL 且当前时间超过 `expires_at`, THE Task_Expiry_Checker SHALL 将该任务 `status` 设为 `inactive`
+4. WHEN 新的 `follow_up_visit` 任务生成时发现同一 `(site_id, assistant_id, member_id)` 已存在一个有 `expires_at` 的 `follow_up_visit` 任务, THE Task_Generator SHALL 将旧任务标记为 `inactive`,创建新的 `active` 任务(`expires_at` 为 NULL)
+5. THE Task_Expiry_Checker SHALL 每小时运行一次,由 `trigger_jobs` 中的 `task_expiry_check` 配置驱动
+
+### 需求 5:任务类型变更与状态流转
+
+**用户故事:** 作为系统,当客户指数变化导致任务类型变更时,系统正确关闭旧任务并创建新任务。
+
+#### 验收标准
+
+1. WHEN 任务类型从 `priority_recall` 变更为 `high_priority_recall`, THE Task_Generator SHALL 将旧 `priority_recall` 任务标记为 `inactive`(`expires_at` 保持 NULL),创建新的 `high_priority_recall` 任务
+2. WHEN 任务类型从 `follow_up_visit` 变更为 `high_priority_recall` 或 `priority_recall`, THE Task_Generator SHALL 保持旧 `follow_up_visit` 任务 `status = 'active'` 并填充 `expires_at = created_at + 48 小时`,创建新的召回任务
+3. WHEN 任务类型从召回类型变回 `follow_up_visit`, THE Task_Generator SHALL 检查是否存在有 `expires_at` 的旧 `follow_up_visit` 任务,若存在则将旧任务标记为 `inactive`,创建新的 `follow_up_visit` 任务
+4. THE Task_Manager SHALL 在每次状态变更时在 `coach_task_history` 中记录 `action`、`old_status`、`new_status`、`old_task_type`、`new_task_type`
+
+### 需求 6:召回完成检测
+
+**用户故事:** 作为助教,我完成召回任务后(客户到店被服务),系统自动标记任务完成。
+
+#### 验收标准
+
+1. WHEN ETL 数据更新后, THE Recall_Completion_Detector SHALL 通过 FDW 读取 `fdw_etl.dwd_assistant_service_log` 中的新增服务记录
+2. WHEN 发现某助教为某客户提供了服务, THE Recall_Completion_Detector SHALL 查找该 `(site_id, assistant_id, member_id)` 下所有 `status = 'active'` 的任务
+3. WHEN 匹配到活跃任务, THE Recall_Completion_Detector SHALL 将任务 `status` 设为 `completed`,记录 `completed_at` 为服务时间,记录 `completed_task_type` 为完成时的任务类型
+4. 
4. WHEN a recall completes, THE Recall_Completion_Detector SHALL fire the `note_reclassify_backfill` event to tell the Note_Reclassifier to run its retroactive note pass
5. THE Recall_Completion_Detector SHALL be driven by the `recall_completion_check` entry in `trigger_jobs`, triggered after the ETL data-update event

### Requirement 7: Retroactive Note Reclassification

**User story:** As the system, when ETL latency makes recall completion land after the note was submitted, I reclassify that note retroactively.

#### Acceptance Criteria

1. WHEN the recall-completed event fires, THE Note_Reclassifier SHALL find the first note with `type = 'normal'` submitted for that `(site_id, assistant_id, member_id)` after the recall service ended
2. WHEN a matching normal note is found, THE Note_Reclassifier SHALL update that note's `type` from `normal` to `follow_up`
3. WHEN the reclassification completes, THE Note_Reclassifier SHALL trigger AI App 6 to compute the note's quality score (the scoring logic is implemented by the P5 AI integration layer; this spec only defines the trigger interface)
4. WHEN AI App 6 returns a score >= 6, THE Note_Reclassifier SHALL create a `follow_up_visit` task marked `completed` (retroactively done)
5. WHEN AI App 6 returns a score < 6, THE Note_Reclassifier SHALL create a `follow_up_visit` task with `status` `active` (follow-up not done; the assistant must write a new note)

### Requirement 8: Task CRUD API

**User story:** As an assistant, I can view my task list, pin or abandon tasks, and unpin or un-abandon them.

#### Acceptance Criteria

1. WHEN an assistant requests the task list, THE Task_Manager SHALL return all tasks with `status = 'active'` for that assistant in the current `site_id`, sorted by `is_pinned DESC, priority_score DESC, created_at ASC`
2. WHEN an assistant requests the task list, THE Task_Manager SHALL include in each task the member's basic info (read from `dim_member` via FDW), the RS index (read from `dws_member_assistant_relation_index` via FDW), and the heart-icon tier (💖 > 8.5 / 🧡 > 7 / 💛 > 5 / 💙 < 5)
3. WHEN an assistant pins a task, THE Task_Manager SHALL set that task's `is_pinned` to TRUE and record the change in `coach_task_history`
4. WHEN an assistant abandons a task, THE Task_Manager SHALL set that task's `status` to `abandoned`, record `abandon_reason` (required), and record the change in `coach_task_history`
5. WHEN an assistant unpins a task, THE Task_Manager SHALL set that task's `is_pinned` to FALSE
6. WHEN an assistant un-abandons a task, THE Task_Manager SHALL restore that task's `status` to `active` and clear `abandon_reason`
7. IF an assistant abandons a task without providing an `abandon_reason`, THEN THE Task_Manager SHALL return HTTP 422
8. THE Task_Manager SHALL authenticate the user through the Permission_Middleware and allow operations only on the assistant's own tasks

### Requirement 9: Note CRUD API

**User story:** As an assistant, after I add a note for a member, the system stores the note content and star ratings correctly.

#### Acceptance Criteria
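A minimal sketch of the type-assignment and rating-validation rules in the criteria below. The function names are hypothetical; the `'follow_up'`/`'normal'` values and the 1–5 range come from this spec.

```python
from typing import Optional


def classify_note(linked_task_type: Optional[str]) -> str:
    """Notes linked to a follow_up_visit task become 'follow_up';
    any other (or no) linked task yields 'normal'."""
    return "follow_up" if linked_task_type == "follow_up_visit" else "normal"


def validate_star_rating(value: Optional[int]) -> None:
    """Star ratings are optional, but when present must be 1-5;
    the API layer surfaces a violation as HTTP 422."""
    if value is not None and not 1 <= value <= 5:
        raise ValueError("star rating must be between 1 and 5")
```

Note that per criterion 9 these star ratings are stored only as auxiliary data; completion still hinges solely on the AI App 6 score.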
1. WHEN an assistant creates a note, THE Note_Service SHALL create a record in `biz.notes` containing `site_id`, `user_id`, `target_type` ('member'), `target_id` (member_id), `type`, `content`, optional `rating_service_willingness` (1-5), optional `rating_revisit_likelihood` (1-5), and optional `task_id`
2. WHEN the note's linked task type is `follow_up_visit`, THE Note_Service SHALL automatically set the note `type` to `follow_up`
3. WHEN the note's linked task type is not `follow_up_visit`, THE Note_Service SHALL set the note `type` to `normal`
4. WHEN a note is created successfully with `type = 'follow_up'`, THE Note_Service SHALL trigger the AI App 6 note-analysis interface (implemented in P5), passing the note content and member info
5. WHEN AI App 6 returns a score >= 6 and the note's linked `follow_up_visit` task has `status = 'active'`, THE Note_Service SHALL mark that task `completed`
6. WHEN an assistant queries a member's notes, THE Note_Service SHALL return all of that member's notes in the current `site_id`, sorted by `created_at DESC`, including star ratings and AI scores
7. WHEN an assistant deletes a note, THE Note_Service SHALL perform a soft or hard delete (per business need); deletion requires a confirmation step (front-end responsibility)
8. IF a star rating is outside the 1-5 range, THEN THE Note_Service SHALL return HTTP 422
9. THE Note_Service star ratings SHALL NOT participate in follow-up completion (completion depends solely on the AI App 6 score being >= 6) and SHALL NOT feed into AI App 6 analysis; they are stored as auxiliary data only

### Requirement 10: Trigger Scheduler Framework

**User story:** As the system, I need one unified trigger-scheduling framework supporting cron, interval, and event-driven triggers.

#### Acceptance Criteria

1. THE Trigger_Scheduler SHALL support `cron` triggers, computing the next run time from a cron expression
2. THE Trigger_Scheduler SHALL support `interval` triggers, computing the next run time from a fixed interval in seconds
3. THE Trigger_Scheduler SHALL support `event` triggers that run immediately when the named event fires
4. WHEN a trigger finishes executing, THE Trigger_Scheduler SHALL update `last_run_at` and `next_run_at` in the `trigger_jobs` table
5. WHEN a trigger has `status = 'disabled'`, THE Trigger_Scheduler SHALL skip it
6. THE Trigger_Scheduler SHALL expose a `fire_event(event_name, payload)` method for firing event-driven jobs
7. IF a trigger raises an error during execution, THEN THE Trigger_Scheduler SHALL log the error without interrupting the other triggers

### Requirement 11: Migration Script Management

**User story:** As a backend developer, I need every database change to have a migration script so that changes are traceable and replayable.

#### Acceptance Criteria

1. THE Migration_Script SHALL keep all business-table DDL in the `db/zqyy_app/migrations/` directory
2. THE Migration_Script SHALL use date-prefixed file names (format: `YYYY-MM-DD__<description>.sql`)
3. THE Migration_Script SHALL be UTF-8 encoded, plain SQL (no ORM)
4. THE Migration_Script SHALL include rollback statements (as comments) in every script
5. THE Migration_Script SHALL use idempotent syntax (`IF NOT EXISTS`, `ON CONFLICT DO NOTHING`) so that re-running a script never errors

### Requirement 12: DDL Rollout to the Test Database and Documentation Sync

**User story:** As a backend developer, I need every DDL change executed and verified in the test database, with the database manual and DDL baseline updated in step.

#### Acceptance Criteria

1. WHEN a migration script is finished, THE Task_Manager SHALL execute it against the `test_zqyy_app` test database and verify it runs without errors
2. WHEN the migration script succeeds, THE Task_Manager SHALL create or update the `docs/database/BD_Manual_biz_tables.md` database manual, covering the change description, compatibility impact, rollback strategy, and verification SQL (at least 3 statements)
3. WHEN the migration script succeeds, THE Task_Manager SHALL run `python scripts/ops/gen_consolidated_ddl.py` to regenerate the DDL baseline file
4. WHEN the seed-data script succeeds, THE Task_Manager SHALL record the seed-data content (trigger configuration) in the database manual

### Requirement 13: Mini-Program Front-End Prototype Fidelity (Mandatory)

**User story:** As a product manager, I need the mini-program pages to stay strictly faithful to the structure and visual detail of the H5 prototypes in `docs/h5_ui/pages/`, so the final implementation closely matches the design.

#### Prototype Index

| Prototype file | Mini-program page | Notes |
|---------|--------------|------|
| `docs/h5_ui/pages/task-list.html` | `pages/task-list/task-list` | Task list page (home): performance progress card; pinned / regular / abandoned sections |
| `docs/h5_ui/pages/task-detail.html` | `pages/task-detail/task-detail` | Task detail page — high-priority recall (theme-red banner) |
| `docs/h5_ui/pages/task-detail-priority.html` | `pages/task-detail/task-detail` | Task detail page — priority recall (theme-orange banner) |
| `docs/h5_ui/pages/task-detail-relationship.html` | `pages/task-detail/task-detail` | Task detail page — relationship building (theme-pink banner) |
| `docs/h5_ui/pages/task-detail-callback.html` | `pages/task-detail/task-detail` | Task detail page — follow-up visit (theme-teal banner) |
| `docs/h5_ui/pages/notes.html` | `pages/notes/notes` | Notes page |
| `docs/h5_ui/pages/customer-detail.html` | `pages/customer-detail/customer-detail` | Customer detail page |

#### Acceptance Criteria

##### 13.A Structural Fidelity (Mandatory)

1. WHEN implementing the task list page, THE mini-program page SHALL reproduce the prototype's structural hierarchy exactly: top user-info area (avatar + name + role tag + store name) → performance progress card (5-segment tiered progress bar + lesson-hour figures with red stamp + bonus incentive + projected income) → task list area (📌 pinned / regular / abandoned sections, each with a label + count)
2. WHEN implementing a task card, THE card SHALL include every element from the prototype: 4px colored left border (high-priority = red, priority = orange, relationship building = pink, follow-up visit = teal), task-type tag (gradient rounded rectangle), member name, heart icon (💖/🧡/💛/💙), note indicator (📝), description line (last visit + balance), AI suggestion line (with AI robot icon), right-hand arrow
3. WHEN implementing the task detail page, THE page SHALL reproduce the prototype's module order exactly: full-width banner (nav bar + member info + abandon button) → retention-clue card (member basics / spending habits / play preferences / key feedback, each entry with a category tag + summary + details + source note) → relationship-with-me card (heart-tier tag + progress bar + RS score + description + recent service record list) → task suggestion card (suggested action + script reference with copy button) → my-notes-for-them card (note list with star ratings + delete buttons) → bottom action bar (Ask the Assistant + Note buttons)
4. WHEN implementing the note dialog, THE dialog SHALL include every element from the prototype: title row (Add Note + expand-rating button), collapsible star-rating area (willingness to be served again, 1-5 stars + likelihood of returning, 1-5 stars, each with a text hint), text input area, save button
5. WHEN implementing the long-press context menu, THE menu SHALL reproduce the prototype interaction: overlay mask + rounded menu panel (pin/unpin, note, abandon/un-abandon, etc.)
6. WHEN implementing the notes page, THE page SHALL reproduce the prototype's list structure: each note shows the content text + footer tags (assistant/member-type tags + timestamp)

##### 13.B Visual Fidelity (Mandatory)

1. THE mini-program pages SHALL use the same TDesign palette as the prototypes: primary=#0052d9, success=#00a870, warning=#ed7b2f, error=#e34d59, gray scale gray-1 (#f3f3f3) through gray-13 (#242424)
2. THE task detail banner SHALL use a per-task-type theme color: high-priority recall = theme-red, priority recall = theme-orange, relationship building = theme-pink, follow-up visit = theme-teal, matching the prototype's gradient backgrounds
3. THE retention-clue category tags SHALL use the prototype's color scheme: member basics = primary/10 background + primary text, spending habits = success/10 background + success text, play preferences = purple-500/10 background + purple-600 text, key feedback = error/10 background + error text
4. THE star-rating component SHALL reproduce the prototype visuals: filled/outline star SVGs with half-star support (used to display the mapped AI score)
5. THE performance progress card SHALL reproduce the prototype's 5-segment tiered progress bar (proportional widths: 0-100 at 45.45%, then 100-130 / 130-160 / 160-190 / 190-220 at 13.64% each), the red-stamp animation, and the highlighted bonus amount

##### 13.C WXML/WXSS Technical Rules (Mandatory)

1. THE mini-program pages SHALL use WXML syntax rather than HTML: `<view>` instead of `<div>`, `<text>` instead of `<span>`/`<p>`, `<image>` instead of `<img>`, `<navigator>` instead of `<a>`; HTML tags are forbidden
2. THE mini-program styles SHALL use WXSS syntax: `rpx` units instead of `px` (750rpx = screen width), `@import` for shared styles; the `rem`/`em`/`vw`/`vh` CSS units are forbidden
3. THE mini-program pages SHALL use `wx:for` instead of JavaScript loop rendering, `wx:if`/`wx:elif`/`wx:else` instead of conditional rendering, `bind:tap` instead of `onclick`, and `data-*` + `e.currentTarget.dataset` instead of DOM manipulation
4. THE mini-program pages SHALL NOT use the following Web features: `document.*`, `window.*`, `localStorage` (use `wx.setStorageSync`), `fetch`/`XMLHttpRequest` (use `wx.request`), or a bottom bar built with CSS `position: fixed` + `bottom: 0` (use the mini-program safe-area adaptation)
5. THE mini-program styles SHALL use only the CSS selectors mini programs support: `.class`, `#id`, `element`, `element, element`, `::after`, `::before`; unsupported selectors such as `>` (child), `+` (adjacent sibling), `~` (general sibling), and `[attr]` (attribute) are forbidden
6. THE mini-program pages SHALL use the `<block>` tag as a render-less wrapper container (replacing HTML's `<template>`-style grouping)
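As a consolidated illustration of rules 1, 3, and 6 above, a hypothetical task-card fragment might look as follows. The class names and data fields are assumptions of this sketch, not taken from the prototypes.

```xml
<!-- Hypothetical task-card skeleton: view/text instead of div/span,
     wx:for for list rendering, bind:tap + dataset instead of DOM events. -->
<block wx:for="{{tasks}}" wx:key="id">
  <view class="task-card" bind:tap="onTaskTap" data-id="{{item.id}}">
    <text class="task-tag">{{item.typeLabel}}</text>
    <text class="member-name">{{item.memberName}} {{item.heartIcon}}</text>
    <text wx:if="{{item.hasNote}}" class="note-indicator">📝</text>
  </view>
</block>
```

The tap handler would then read the task id from `e.currentTarget.dataset.id` rather than touching any DOM API.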