chore: migrate project path C:\NeoZQYY → C:\Project\NeoZQYY

The development environment moved from the old VM (DESKTOP-KGB0K5G) to a new machine (DESKTOP-D676QDA),
and the project directory changed from C:\NeoZQYY to C:\Project\NeoZQYY.
Absolute path references were batch-replaced across 126 files.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
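The replacement tooling itself is not part of the diff; a minimal sketch of how such a bulk rewrite could be scripted (the skip-binary heuristic and function name are assumptions, not the tooling actually used):

```python
import os

OLD = r"C:\NeoZQYY"
NEW = r"C:\Project\NeoZQYY"

def migrate_paths(root: str) -> int:
    """Replace OLD path references with NEW under `root`.

    Returns the number of files rewritten. Files that are not valid
    UTF-8 (e.g. binaries) are skipped. Hypothetical sketch only.
    """
    changed = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8") as fh:
                    text = fh.read()
            except (UnicodeDecodeError, OSError):
                continue  # skip binary or unreadable files
            if OLD in text:
                with open(path, "w", encoding="utf-8") as fh:
                    fh.write(text.replace(OLD, NEW))
                changed += 1
    return changed
```

In practice one would also want to exclude `.git/` and verify the result with a full-tree grep before committing.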
Neo
2026-04-10 06:45:29 +08:00
parent f9b1039970
commit 66c9ae8738
126 changed files with 4154 additions and 4127 deletions


@@ -15,8 +15,8 @@ CONTEXT TRANSFER: We are continuing a conversation that had gotten too long. Her
The user requested running the full dataflow structure analysis pipeline for the feiqiu (飞球) connector. This involved two existing scripts that were already fully implemented.
**Phase 1 — Data Collection** (`analyze_dataflow.py`):
-- Successfully ran `C:\ProgramData\miniconda3\python.exe C:\NeoZQYY\scripts\ops\analyze_dataflow.py`
-- Had shell working directory issues (kept defaulting to `C:\NeoZQYY\apps\etl\connectors\feiqiu`), resolved by using absolute paths for both Python and script
+- Successfully ran `C:\ProgramData\miniconda3\python.exe C:\Project\NeoZQYY\scripts\ops\analyze_dataflow.py`
+- Had shell working directory issues (kept defaulting to `C:\Project\NeoZQYY\apps\etl\connectors\feiqiu`), resolved by using absolute paths for both Python and script
- Script performed 3 rounds of adaptive date expansion (30→60→90 days) for 11 tables with insufficient records
- Final date range: 2025-11-22 ~ 2026-02-20
- Results: 23 tables, all successful, 3395 total records
@@ -28,7 +28,7 @@ The user requested running the full dataflow structure analysis pipeline for the
- `collection_manifest.json` — with json_field_count, date_from, date_to
**Phase 2 — Report Generation** (`gen_dataflow_report.py`):
-- Successfully ran `C:\ProgramData\miniconda3\python.exe C:\NeoZQYY\scripts\ops\gen_dataflow_report.py`
+- Successfully ran `C:\ProgramData\miniconda3\python.exe C:\Project\NeoZQYY\scripts\ops\gen_dataflow_report.py`
- Output: `export/SYSTEM/REPORTS/dataflow_analysis/dataflow_2026-02-20_002258.md` (568.6 KB)
- Report confirmed to contain all required enhanced content:
- Report header with API date range and JSON data volume
@@ -61,7 +61,7 @@ The user requested running the full dataflow structure analysis pipeline for the
- Python 3.10+, uv workspace, PostgreSQL (4 databases: etl_feiqiu, test_etl_feiqiu, zqyy_app, test_zqyy_app)
- All output paths via `.env` environment variables → `export/` directory tree
- Scripts in `scripts/ops/` use `_env_paths.get_output_path()` for path resolution
-- Shell quirk: PowerShell working directory often stuck at `C:\NeoZQYY\apps\etl\connectors\feiqiu`; use absolute paths for Python executable and script paths
+- Shell quirk: PowerShell working directory often stuck at `C:\Project\NeoZQYY\apps\etl\connectors\feiqiu`; use absolute paths for Python executable and script paths
**Existing specs**: `.kiro/specs/dataflow-structure-audit/` has completed requirements.md, design.md, tasks.md (all tasks marked done)
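The `_env_paths.get_output_path()` helper referenced above is not shown in this commit; a minimal sketch of what such a `.env`-driven resolver might look like, assuming a hypothetical `EXPORT_ROOT` environment variable:

```python
import os
from pathlib import Path

def get_output_path(*parts: str) -> Path:
    """Resolve an output path under the export/ directory tree.

    EXPORT_ROOT is a hypothetical variable name; the real helper lives
    in scripts/ops/_env_paths.py and is not shown in this commit.
    Parent directories are created so callers can write immediately.
    """
    root = Path(os.environ.get("EXPORT_ROOT", "export"))
    path = root.joinpath(*parts)
    path.parent.mkdir(parents=True, exist_ok=True)
    return path
```

Routing every script output through one resolver like this is what makes a machine migration a pure path-prefix change rather than a per-script hunt.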