Compare commits

9 commits:

- 579ff46a7f
- 35c5600b36
- c599897142
- c964d66e64
- f0e6bba7e9
- 61d7e493bd
- f930c79b34
- a0a57e0969
- b3003c4858
.gitignore (vendored): 7 changes
````diff
@@ -75,7 +75,12 @@ Network Trash Folder
 Temporary Items
 .apdisk
 
-# Release artifacts (binaries, archives, checksums), but DO track releases/memory/ for changelog
-releases/*
-!releases/README.md
-!releases/memory/
-!releases/memory/**
+# Release artifacts (binaries, archives, checksums), but keep markdown notes tracked
+!releases/**/
+releases/**/*
+!releases/README.md
+!releases/*/RELEASE_NOTES.md
````
README.md: 77 changes
````diff
@@ -1,66 +1,53 @@
 # QuoteForge
 
-**Corporate server configurator and quote calculator**
-
-Offline-first architecture: user operations go through local SQLite; MariaDB is used only for synchronization.
-
-
-
-
-
----
-
-## Documentation
-
-The full architecture documentation lives in **[bible/](bible/README.md)**:
-
-| File | Topic |
-|------|------|
-| [bible/01-overview.md](bible/01-overview.md) | Product, features, tech stack, repo structure |
-| [bible/02-architecture.md](bible/02-architecture.md) | Local-first, sync, pricing, versioning |
-| [bible/03-database.md](bible/03-database.md) | SQLite and MariaDB schemas, permissions, migrations |
-| [bible/04-api.md](bible/04-api.md) | All API endpoints and web routes |
-| [bible/05-config.md](bible/05-config.md) | Configuration, env vars, installation |
-| [bible/06-backup.md](bible/06-backup.md) | Backups |
-| [bible/07-dev.md](bible/07-dev.md) | Development commands, code style, guardrails |
-
----
-
-## Quick start
-
-```bash
-# Apply migrations
-go run ./cmd/qfs -migrate
-
-# Run
-go run ./cmd/qfs
-# or
-make run
-```
+Local-first desktop web app for server configuration, quotation, and project work.
+
+Runtime model:
+- user work is stored in local SQLite;
+- MariaDB is used only for setup checks and background sync;
+- HTTP server binds to loopback only.
+
+## What the app does
+
+- configuration editor with price refresh from synced pricelists;
+- projects with variants and ordered configurations;
+- vendor BOM import and PN -> LOT resolution;
+- revision history with rollback;
+- rotating local backups.
+
+## Run
````
````diff
-The app: http://localhost:8080 -> opens `/setup` to configure the MariaDB connection.
-
----
-
-## Releases & Changelog
-
-Changelog between versions: `releases/memory/v{major}.{minor}.{patch}.md`
-
----
-
-## Support
-
-- Email: mike@mchus.pro
-- Internal: @mchus
-
-## License
-
-Company property, internal use only. See [LICENSE](LICENSE).
+Useful commands:
+
+```bash
+# Build
+go run ./cmd/qfs -migrate
+go test ./...
+go vet ./...
+make build-release
+
+# Check
+go build ./cmd/qfs && go vet ./...
+```
+
+On first run the app creates a minimal `config.yaml`, starts on `http://127.0.0.1:8080`, and opens `/setup` if DB credentials were not saved yet.
+
+## Documentation
+
+- Shared engineering rules: [bible/README.md](bible/README.md)
+- Project architecture: [bible-local/README.md](bible-local/README.md)
+- Release notes: `releases/<version>/RELEASE_NOTES.md`
+
+`bible-local/` is the source of truth for QuoteForge-specific architecture. If code changes behavior, update the matching file there in the same commit.
+
+## Repository map
+
+```text
+cmd/                 entry points and migration tools
+internal/            application code
+web/                 templates and static assets
+bible/               shared engineering rules
+bible-local/         project architecture and contracts
+releases/            packaged release artifacts and release notes
+config.example.yaml  runtime config reference
+```
````
````diff
@@ -1,130 +1,70 @@
-# 01 — Product Overview
-
-## What is QuoteForge
-
-A corporate server configuration and quotation tool.
-Operates in **strict local-first** mode: all user operations go through local SQLite; MariaDB is used only by synchronization and dedicated setup/migration tooling.
-
----
-
-## Features
-
-### For Users
-- Mobile-first interface — works comfortably on phones and tablets
-- Server configurator — step-by-step component selection
-- Automatic price calculation — based on pricelists from local cache
-- CSV export — ready-to-use specifications for clients
-- Configuration history — versioned snapshots with rollback support
-- Full offline operation — continue working without network, sync later
-- Guarded synchronization — sync is blocked by preflight check if local schema is not ready
-
-### Local Client Security Model
-
-QuoteForge is currently a **single-user thick client** bound to `localhost`.
-
-- The local HTTP/UI layer is not treated as a multi-user security boundary.
-- RBAC is not part of the active product contract for the local client.
-- The authoritative authentication boundary is the remote sync server and its DB credentials captured during setup.
-- If the app is ever exposed beyond `localhost`, auth/RBAC must be reintroduced as an enforced perimeter before release.
-
-### Price Freshness Indicators
+# 01 - Overview
+
+## Product
+
+QuoteForge is a local-first tool for server configuration, quotation, and project tracking.
+
+Core user flows:
+- create and edit configurations locally;
+- calculate prices from synced pricelists;
+- group configurations into projects and variants;
+- import vendor workspaces and map vendor PNs to internal LOTs;
+- review revision history and roll back safely.
+
+## Runtime model
+
+QuoteForge is a single-user thick client.
+
+Rules:
+- runtime HTTP binds to loopback only;
+- browser requests are treated as part of the same local user session;
+- MariaDB is not a live dependency for normal CRUD;
+- if non-loopback deployment is ever introduced, auth/RBAC must be added first.
+
+## Product scope
+
+In scope:
+- configurator and quote calculation;
+- projects, variants, and configuration ordering;
+- local revision history;
+- read-only pricelist browsing from SQLite cache;
+- background sync with MariaDB;
+- rotating local backups.
+
+Out of scope and intentionally removed:
+- admin pricing UI/API;
+- alerts and notification workflows;
+- stock import tooling;
+- cron jobs and importer utilities.
````
````diff
-| Color | Status | Condition |
-|-------|--------|-----------|
-| Green | Fresh | < 30 days, ≥ 3 sources |
-| Yellow | Normal | 30–60 days |
-| Orange | Aging | 60–90 days |
-| Red | Stale | > 90 days or no data |
-
----
-
-## Tech Stack
-
-| Layer | Stack |
-|-------|-------|
-| Backend | Go 1.22+, Gin, GORM |
-| Frontend | HTML, Tailwind CSS, htmx |
-| Local DB | SQLite (`qfs.db`) |
-| Server DB | MariaDB 11+ (sync transport only for app runtime) |
-| Export | encoding/csv, excelize (XLSX) |
-
----
-
-## Product Scope
-
-**In scope:**
-- Component configurator and quotation calculation
-- Projects and configurations
-- Read-only pricelist viewing from local cache
-- Sync (pull components/pricelists, push local changes)
-
-**Out of scope (removed intentionally — do not restore):**
-- Admin pricing UI/API
-- Stock import
-- Alerts
-- Cron/importer utilities
-
----
-
-## Repository Structure
+## Tech stack
+
+| Layer | Stack |
+| --- | --- |
+| Backend | Go, Gin, GORM |
+| Frontend | HTML templates, htmx, Tailwind CSS |
+| Local storage | SQLite |
+| Sync transport | MariaDB |
+| Export | CSV and XLSX generation |
+
+## Repository map
+
+```text
+cmd/
+  qfs/                   main HTTP runtime
+  migrate/               server migration tool
+  migrate_ops_projects/  OPS project migration helper
+internal/
+  appstate/    backup and runtime state
+  config/      runtime config parsing
+  handlers/    HTTP handlers
+  localdb/     SQLite models and migrations
+  repository/  repositories
+  services/    business logic and sync
+web/
+  templates/   HTML templates
+  static/      static assets
+bible/         shared engineering rules
+bible-local/   project-specific architecture
+releases/      release artifacts and notes
+```
````
````diff
-```text
-quoteforge/
-├── cmd/
-│   ├── qfs/main.go            # HTTP server entry point
-│   ├── migrate/               # Migration tool
-│   └── migrate_ops_projects/  # OPS project migrator
-├── internal/
-│   ├── appmeta/      # App version metadata
-│   ├── appstate/     # State management, backup
-│   ├── article/      # Article generation
-│   ├── config/       # Config parsing
-│   ├── db/           # DB initialization
-│   ├── handlers/     # HTTP handlers
-│   ├── localdb/      # SQLite layer
-│   ├── middleware/   # Auth, CORS, etc.
-│   ├── models/       # GORM models
-│   ├── repository/   # Repository layer
-│   └── services/     # Business logic
-├── web/
-│   ├── templates/    # HTML templates + partials
-│   └── static/       # CSS, JS, assets
-├── migrations/       # SQL migration files (30+)
-├── bible/            # Architectural documentation (this section)
-├── releases/memory/  # Per-version changelogs
-├── config.example.yaml  # Config template (the only one in repo)
-└── go.mod
-```
````
````diff
 ---
 
 ## Integration with Existing DB
 
 QuoteForge integrates with the existing `RFQ_LOG` database.
 
 Hard boundary:
 
 - normal runtime HTTP handlers, UI flows, pricing, export, BOM resolution, and project/config CRUD must use SQLite only;
 - MariaDB access is allowed only inside `internal/services/sync/*` and dedicated setup/migration tools under `cmd/`;
 - any new direct MariaDB query in non-sync runtime code is an architectural violation.
 
 **Read-only:**
 - `lot` — component catalog
 - `qt_lot_metadata` — extended component data
 - `qt_categories` — categories
 - `qt_pricelists`, `qt_pricelist_items` — pricelists
 - `stock_log` — stock quantities consumed during sync enrichment
 - `qt_partnumber_books`, `qt_partnumber_book_items` — partnumber book snapshots consumed during sync pull
 
 **Read + Write:**
 - `qt_configurations` — configurations
 - `qt_projects` — projects
 
 **Sync service tables:**
 - `qt_client_schema_state` — applied migrations state and operational client status per device (`username + hostname`)
   Fields written by QuoteForge:
   `app_version`, `last_sync_at`, `last_sync_status`,
   `pending_changes_count`, `pending_errors_count`, `configurations_count`, `projects_count`,
   `estimate_pricelist_version`, `warehouse_pricelist_version`, `competitor_pricelist_version`,
   `last_sync_error_code`, `last_sync_error_text`, `last_checked_at`, `updated_at`
 - `qt_pricelist_sync_status` — pricelist sync status
````
````diff
@@ -1,251 +1,75 @@
-# 02 — Architecture
-
-## Local-First Principle
-
-**SQLite** is the single source of truth for the user.
-**MariaDB** is a sync server only — it never blocks local operations.
-
-```
-User
- │
- ▼
-SQLite (qfs.db)  ← all CRUD operations go here
- │
- │ background sync (every 5 min)
- ▼
-MariaDB (RFQ_LOG) ← pull/push only
-```
-
-**Rules:**
-- All CRUD operations go through SQLite only
-- If MariaDB is unavailable → local work continues without restrictions
-- Changes are queued in `pending_changes` and pushed on next sync
-
-## MariaDB Boundary
-
-MariaDB is not part of the runtime read/write path for user features.
-
-Hard rules:
-
-- HTTP handlers, web pages, quote calculation, export, vendor BOM resolution, pricelist browsing, project browsing, and configuration CRUD must read/write SQLite only.
-- MariaDB access from the app runtime is allowed only inside the sync subsystem (`internal/services/sync/*`) for explicit pull/push work.
-- Dedicated tooling under `cmd/migrate` and `cmd/migrate_ops_projects` may access MariaDB for operator-run schema/data migration tasks.
-- Setup may test/store connection settings, but after setup the application must treat MariaDB as sync transport only.
-- Any new repository/service/handler that issues MariaDB queries outside sync is a regression and must be rejected in review.
-- Local SQLite migrations are code-defined only (`AutoMigrate` + `runLocalMigrations`); there is no server-driven client migration registry.
-- Read-only local sync caches are disposable. If a local cache table cannot be migrated safely at startup, the client may quarantine/reset that cache and continue booting.
+# 02 - Architecture
+
+## Local-first rule
+
+SQLite is the runtime source of truth.
+MariaDB is sync transport plus setup and migration tooling.
+
+```text
+browser -> Gin handlers -> SQLite
+                        -> pending_changes
+background sync <------> MariaDB
+```
+
+Rules:
+- user CRUD must continue when MariaDB is offline;
+- runtime handlers and pages must read and write SQLite only;
+- MariaDB access in runtime code is allowed only inside sync and setup flows;
+- no live MariaDB fallback for reads that already exist in local cache.
+
+## Sync contract
+
+Bidirectional:
+- projects;
+- configurations;
+- `vendor_spec`;
+- pending change metadata.
+
+Pull-only:
+- components;
+- pricelists and pricelist items;
+- partnumber books and partnumber book items.
+
+Readiness guard:
+- every sync push/pull runs a preflight check;
+- blocked sync returns `423 Locked` with a machine-readable reason;
+- local work continues even when sync is blocked;
+- sync metadata updates must preserve project `updated_at`; sync time belongs in `synced_at`, not in the user-facing last-modified timestamp.
````
````diff
-Forbidden patterns:
-
-- calling `connMgr.GetDB()` from non-sync runtime business code;
-- constructing MariaDB-backed repositories in handlers for normal user requests;
-- using MariaDB as online fallback for reads when local SQLite already contains the synced dataset;
-- adding UI/API features that depend on live MariaDB availability.
-
-## Local Client Boundary
-
-The running app is a localhost-only thick client.
-
-- Browser/UI requests on the local machine are treated as part of the same trusted user session.
-- Local routes are not modeled as a hardened multi-user API perimeter.
-- Authorization to the central server happens through the saved MariaDB connection configured during setup.
-- Any future deployment that binds beyond `127.0.0.1` must add enforced auth/RBAC before exposure.
-
----
-
-## Synchronization
-
-### Data Flow Diagram
-
-```
-               [ SERVER / MariaDB ]
-          ┌───────────────────────────┐
-          │ qt_projects               │
-          │ qt_configurations         │
-          │ qt_pricelists             │
-          │ qt_pricelist_items        │
-          │ qt_pricelist_sync_status  │
-          └─────────────┬─────────────┘
-                        │
-        pull (projects/configs/pricelists)
-                        │
-   ┌────────────────────┴────────────────────┐
-   │                                         │
-[ CLIENT A / SQLite ]              [ CLIENT B / SQLite ]
-local_projects                     local_projects
-local_configurations               local_configurations
-local_pricelists                   local_pricelists
-local_pricelist_items              local_pricelist_items
-pending_changes                    pending_changes
-   │                                         │
-   └────── push (projects/configs only) ─────┘
-                        │
-               [ SERVER / MariaDB ]
-```
+## Pricing contract
+
+Prices come only from `local_pricelist_items`.
+
+Rules:
+- `local_components` is metadata-only;
+- quote calculation must not read prices from components;
+- latest pricelist selection ignores snapshots without items;
+- auto pricelist mode stays auto and must not be persisted as an explicit resolved ID.
+
+## Configuration versioning
+
+Configuration revisions are append-only snapshots stored in `local_configuration_versions`.
+
+Rules:
+- create a new revision only when spec or price content changes;
+- rollback creates a new head revision from an old snapshot;
+- rename, reorder, project move, and similar operational edits do not create a new revision snapshot;
+- current revision pointer must be recoverable if legacy or damaged rows are found locally.
+
+## Naming collisions
+
+UI-driven rename and copy flows use one suffix convention for conflicts.
+
+Rules:
+- configuration and variant names must auto-resolve collisions with `_копия`, then `_копия2`, `_копия3`, and so on;
+- copy checkboxes and copy modals must prefill `_копия`, not ` (копия)`;
+- the literal variant name `main` is reserved and must not be allowed for non-main variants.
````
````diff
-### Sync Direction by Entity
-
-| Entity | Direction |
-|--------|-----------|
-| Configurations | Client ↔ Server ↔ Other Clients |
-| Projects | Client ↔ Server ↔ Other Clients |
-| Pricelists | Server → Clients only (no push) |
-| Components | Server → Clients only |
-| Partnumber books | Server → Clients only |
-
-Local pricelists not present on the server and not referenced by active configurations are deleted automatically on sync.
-
-### Soft Deletes (Archive Pattern)
-
-Configurations and projects are **never hard-deleted**. Deletion is archive via `is_active = false`.
-
-- `DELETE /api/configs/:uuid` → sets `is_active = false` (archived); can be restored via `reactivate`
-- `DELETE /api/projects/:uuid` → archives a project **variant** only (`variant` field must be non-empty); main projects cannot be deleted via this endpoint
-
-## Sync Readiness Guard
-
-Before every push/pull, a preflight check runs:
-1. Is the server (MariaDB) reachable?
-2. Is the local client schema initialized and writable?
-
-**If the check fails:**
-- Local CRUD continues without restriction
-- Sync API returns `423 Locked` with `reason_code` and `reason_text`
-- UI shows a red indicator with the block reason
+## Vendor BOM contract
+
+Vendor BOM is stored in `vendor_spec` on the configuration row.
````
````diff
----
-
-## Pricing
-
-### Principle
-
-**Prices come only from `local_pricelist_items`.**
-Components (`local_components`) are metadata-only — they contain no pricing information.
-Stock enrichment for pricelist rows is persisted into `local_pricelist_items` during sync; UI/runtime must not resolve it live from MariaDB.
-
-### Lookup Pattern
-
-```go
-// Look up a price for a line item
-price, found := s.lookupPriceByPricelistID(pricelistID, lotName)
-if found && price > 0 {
-    // use price
-}
-
-// Inside lookupPriceByPricelistID:
-localPL, err := s.localDB.GetLocalPricelistByServerID(pricelistID)
-price, err := s.localDB.GetLocalPriceForLot(localPL.ID, lotName)
-```
-
-### Multi-Level Pricelists
-
-A configuration can reference up to three pricelists simultaneously:
-
-| Field | Purpose |
-|-------|---------|
-| `pricelist_id` | Primary (estimate) |
-| `warehouse_pricelist_id` | Warehouse pricing |
-| `competitor_pricelist_id` | Competitor pricing |
-
-Pricelist sources: `estimate` | `warehouse` | `competitor`
-
-### "Auto" Pricelist Selection
-
-Configurator supports explicit and automatic selection per source (`estimate`, `warehouse`, `competitor`):
-
-- **Explicit mode:** concrete `pricelist_id` is set by user in settings.
-- **Auto mode:** client sends no explicit ID for that source; backend resolves the current latest active pricelist.
-
-`auto` must stay `auto` after price-level refresh and after manual "refresh prices":
-- resolved IDs are runtime-only and must not overwrite user's mode;
-- switching to explicit selection must clear runtime auto resolution for that source.
-
-### Latest Pricelist Resolution Rules
-
-For both server (`qt_pricelists`) and local cache (`local_pricelists`), "latest by source" is resolved with:
-
-1. only pricelists that have at least one item (`EXISTS ...pricelist_items`);
-2. deterministic sort: `created_at DESC, id DESC`.
-
-This prevents selecting empty/incomplete snapshots and removes nondeterministic ties.
````
````diff
----
-
-## Configuration Versioning
-
-### Principle
-
-Append-only for **spec+price** changes: immutable snapshots are stored in `local_configuration_versions`.
-
-```
-local_configurations
-  └── current_version_id ──► local_configuration_versions (v3)  ← active
-                             local_configuration_versions (v2)
-                             local_configuration_versions (v1)
-```
-
-- `version_no = max + 1` when configuration **spec+price** changes
-- Old versions are never modified or deleted in normal flow
-- Rollback does **not** rewind history — it creates a **new** version from the snapshot
-- Operational updates (`line_no` reorder, server count, project move, rename)
-  are synced via `pending_changes` but do **not** create a new revision snapshot
-
-### Rollback
-
-```bash
-POST /api/configs/:uuid/rollback
-{
-  "target_version": 3,
-  "note": "optional comment"
-}
-```
-
-Result:
-- A new version `vN` is created with `data` from the target version
-- `change_note = "rollback to v{target_version}"` (+ note if provided)
-- `current_version_id` is switched to the new version
-- Configuration moves to `sync_status = pending`
````
````diff
-### Sync Status Flow
-
-```
-local → pending → synced
-```
-
----
-
-## Project Specification Ordering (`Line`)
-
-- Each project configuration has persistent `line_no` (`10,20,30...`) in both SQLite and MariaDB.
-- Project list ordering is deterministic:
-  `line_no ASC`, then `created_at DESC`, then `id DESC`.
-- Drag-and-drop reorder in project UI updates `line_no` for active project configurations.
-- Reorder writes are queued as configuration `update` events in `pending_changes`
-  without creating new configuration versions.
-- Backward compatibility: if remote MariaDB schema does not yet include `line_no`,
-  sync falls back to create/update without `line_no` instead of failing.
````
````diff
----
-
-## Sync Payload for Versioning
-
-Events in `pending_changes` for configurations contain:
-
-| Field | Description |
-|-------|-------------|
-| `configuration_uuid` | Identifier |
-| `operation` | `create` / `update` / `rollback` |
-| `current_version_id` | Active version ID |
-| `current_version_no` | Version number |
-| `snapshot` | Current configuration state |
-| `idempotency_key` | For idempotent push |
-| `conflict_policy` | `last_write_wins` |
-
----
-
-## Background Processes
-
-| Process | Interval | What it does |
-|---------|----------|--------------|
-| Sync worker | 5 min | push pending + pull all |
-| Backup scheduler | configurable (`backup.time`) | creates ZIP archives |
+Rules:
+- PN to LOT resolution uses the active local partnumber book;
+- canonical persisted mapping is `lot_mappings[]`;
+- QuoteForge does not use legacy BOM tables such as `qt_bom`, `qt_lot_bundles`, or `qt_lot_bundle_items`.
````
````diff
@@ -1,267 +1,67 @@
-# 03 — Database
-
-## SQLite (local, client-side)
-
-File: `qfs.db` in the user-state directory (see [05-config.md](05-config.md)).
-
-### Tables
-
-#### Components and Reference Data
-
-| Table | Purpose | Key Fields |
-|-------|---------|------------|
-| `local_components` | Component metadata (NO prices) | `lot_name` (PK), `lot_description`, `category`, `model` |
-| `connection_settings` | MariaDB connection settings | key-value store |
-| `app_settings` | Application settings | `key` (PK), `value`, `updated_at` |
-
-Read-only cache contract:
-
-- `local_components`, `local_pricelists`, `local_pricelist_items`, `local_partnumber_books`, and `local_partnumber_book_items` are synchronized caches, not user-authored data.
-- Startup must prefer application availability over preserving a broken cache schema.
-- If one of these tables cannot be migrated safely, the client may quarantine or drop it and recreate it empty; the next sync repopulates it.
-
-#### Pricelists
-
-| Table | Purpose | Key Fields |
-|-------|---------|------------|
-| `local_pricelists` | Pricelist headers | `id`, `server_id` (unique), `source`, `version`, `created_at` |
-| `local_pricelist_items` | Pricelist line items ← **sole source of prices** | `id`, `pricelist_id` (FK), `lot_name`, `price`, `lot_category` |
-
-#### Partnumber Books (PN → LOT mapping, pull-only from PriceForge)
-
-| Table | Purpose | Key Fields |
-|-------|---------|------------|
-| `local_partnumber_books` | Version snapshots of PN→LOT mappings | `id`, `server_id` (unique), `version`, `created_at`, `is_active` |
-| `local_partnumber_book_items` | Canonical PN catalog rows | `id`, `partnumber`, `lots_json`, `description` |
-
-Active book: `WHERE is_active=1 ORDER BY created_at DESC, id DESC LIMIT 1`
-
-#### Configurations and Projects
-
-| Table | Purpose | Key Fields |
-|-------|---------|------------|
-| `local_configurations` | Saved configurations | `id`, `uuid` (unique), `items` (JSON), `vendor_spec` (JSON: PN/qty/description + canonical `lot_mappings[]`), `line_no`, `pricelist_id`, `warehouse_pricelist_id`, `competitor_pricelist_id`, `current_version_id`, `sync_status` |
-| `local_configuration_versions` | Immutable snapshots (revisions) | `id`, `configuration_id` (FK), `version_no`, `data` (JSON), `change_note`, `created_at` |
-| `local_projects` | Projects | `id`, `uuid` (unique), `name`, `code`, `sync_status` |
-
-#### Sync
-
-| Table | Purpose |
-|-------|---------|
-| `pending_changes` | Queue of changes to push to MariaDB |
-| `local_schema_migrations` | Applied migrations (idempotency guard) |
-
----
+# 03 - Database
+
+## SQLite
+
+SQLite is the local runtime database.
+
+Main tables:
+
+| Table | Purpose |
+| --- | --- |
+| `local_components` | synced component metadata |
+| `local_pricelists` | local pricelist headers |
+| `local_pricelist_items` | local pricelist rows, the only runtime price source |
+| `local_projects` | user projects |
+| `local_configurations` | user configurations |
+| `local_configuration_versions` | immutable revision snapshots |
+| `local_partnumber_books` | partnumber book headers |
+| `local_partnumber_book_items` | PN -> LOT catalog payload |
+| `pending_changes` | sync queue |
+| `connection_settings` | encrypted MariaDB connection settings |
+| `app_settings` | local app state |
+| `local_schema_migrations` | applied local migration markers |
+
+Rules:
+- cache tables may be rebuilt if local migration recovery requires it;
+- user-authored tables must not be dropped as a recovery shortcut;
+- `local_pricelist_items` is the only valid runtime source of prices;
+- configuration `items` and `vendor_spec` are stored as JSON payloads inside configuration rows.
````
````diff
-### Key SQLite Indexes
-
-```sql
--- Pricelists
-INDEX local_pricelist_items(pricelist_id)
-UNIQUE INDEX local_pricelists(server_id)
-INDEX local_pricelists(source, created_at)  -- used for "latest by source" queries
--- latest-by-source runtime query also applies deterministic tie-break by id DESC
-
--- Configurations
-INDEX local_configurations(pricelist_id)
-INDEX local_configurations(warehouse_pricelist_id)
-INDEX local_configurations(competitor_pricelist_id)
-INDEX local_configurations(project_uuid, line_no)  -- project ordering (Line column)
-UNIQUE INDEX local_configurations(uuid)
-```
-
-### `items` JSON Structure in Configurations
-
-```json
-{
-  "items": [
-    {
-      "lot_name": "CPU_AMD_9654",
-      "quantity": 2,
-      "unit_price": 123456.78,
-      "section": "Processors"
-    }
-  ]
-}
-```
-
-Prices are stored inside the `items` JSON field and refreshed from the pricelist on configuration refresh.
-
----
+## MariaDB
+
+MariaDB is the central sync database.
+
+Runtime read permissions:
+- `lot`
+- `qt_lot_metadata`
+- `qt_categories`
+- `qt_pricelists`
+- `qt_pricelist_items`
+- `stock_log`
+- `qt_partnumber_books`
+- `qt_partnumber_book_items`
+
+Runtime read/write permissions:
+- `qt_projects`
+- `qt_configurations`
+- `qt_client_schema_state`
+- `qt_pricelist_sync_status`
+
+Insert-only tracking:
+- `qt_vendor_partnumber_seen`
````
---
|
||||
|
||||
## MariaDB (server-side, sync-only)
|
||||
|
||||
Database: `RFQ_LOG`
|
||||
|
||||
### Tables and Permissions
|
||||
|
||||
| Table | Purpose | Permissions |
|
||||
|-------|---------|-------------|
|
||||
| `lot` | Component catalog | SELECT |
|
||||
| `qt_lot_metadata` | Extended component data | SELECT |
|
||||
| `qt_categories` | Component categories | SELECT |
|
||||
| `qt_pricelists` | Pricelists | SELECT |
|
||||
| `qt_pricelist_items` | Pricelist line items | SELECT |
|
||||
| `stock_log` | Latest stock qty by partnumber (pricelist enrichment during sync only) | SELECT |
|
||||
| `qt_configurations` | Saved configurations (includes `line_no`) | SELECT, INSERT, UPDATE |
|
||||
| `qt_projects` | Projects | SELECT, INSERT, UPDATE |
|
||||
| `qt_client_schema_state` | Applied migrations state + client operational status per `username + hostname` | SELECT, INSERT, UPDATE |
|
||||
| `qt_pricelist_sync_status` | Pricelist sync status | SELECT, INSERT, UPDATE |
|
||||
| `qt_partnumber_books` | Partnumber book headers with snapshot membership in `partnumbers_json` (written by PriceForge) | SELECT |
|
||||
| `qt_partnumber_book_items` | Canonical PN catalog with `lots_json` composition (written by PriceForge) | SELECT |
|
||||
| `qt_vendor_partnumber_seen` | Vendor PN tracking for unresolved/ignored BOM rows (`is_ignored`) | INSERT only for new `partnumber`; existing rows must not be modified |
|
||||
|
||||
Legacy server tables no longer used by the QuoteForge runtime:

- `qt_bom`
- `qt_lot_bundles`
- `qt_lot_bundle_items`

QuoteForge canonical BOM storage is:

- `qt_configurations.vendor_spec`
- row-level PN → multiple LOT decomposition in `vendor_spec[].lot_mappings[]`
Partnumber book server read contract:

1. Read the active or target book from `qt_partnumber_books`.
2. Parse `partnumbers_json`.
3. Load payloads from `qt_partnumber_book_items WHERE partnumber IN (...)`.

Pricelist stock enrichment contract:

1. Sync pulls base pricelist rows from `qt_pricelist_items`.
2. Sync reads the latest stock quantities from `stock_log`.
3. Sync resolves `partnumber -> lot` through the local mirror of `qt_partnumber_book_items` (`local_partnumber_book_items.lots_json`).
4. Sync stores the enriched `available_qty` and `partnumbers` into `local_pricelist_items`.

Runtime rule:

- pricelist UI and quote logic read only `local_pricelist_items`;
- runtime code must not query `stock_log`, `qt_pricelist_items`, or `qt_partnumber_book_items` directly outside sync.
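The resolution step of the enrichment contract parses `lots_json` into LOT references. A minimal Go sketch; the function name and types are illustrative, not the app's actual sync code:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// LotRef is one element of local_partnumber_book_items.lots_json,
// e.g. [{"lot_name":"CPU_X","qty":2}].
type LotRef struct {
	LotName string `json:"lot_name"`
	Qty     int    `json:"qty"`
}

// ResolveLots parses a lots_json payload into LOT references.
// This mirrors the sync-time partnumber -> lot resolution step.
func ResolveLots(lotsJSON string) ([]LotRef, error) {
	var lots []LotRef
	if err := json.Unmarshal([]byte(lotsJSON), &lots); err != nil {
		return nil, err
	}
	return lots, nil
}

func main() {
	lots, err := ResolveLots(`[{"lot_name":"CPU_X","qty":2},{"lot_name":"HS_X","qty":2}]`)
	if err != nil {
		panic(err)
	}
	for _, l := range lots {
		fmt.Printf("%s x%d\n", l.LotName, l.Qty)
	}
}
```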
`qt_partnumber_book_items` no longer contains `book_id` or `lot_name`.
It stores one row per `partnumber` with:

- `partnumber`
- `lots_json` as `[{"lot_name":"CPU_X","qty":2}, ...]`
- `description`

`qt_client_schema_state` current contract:

- identity key: `username + hostname`
- client/runtime state: `app_version`, `last_checked_at`, `updated_at`
- operational state: `last_sync_at`, `last_sync_status`
- queue health: `pending_changes_count`, `pending_errors_count`
- local dataset size: `configurations_count`, `projects_count`
- price context: `estimate_pricelist_version`, `warehouse_pricelist_version`, `competitor_pricelist_version`
- last known sync problem: `last_sync_error_code`, `last_sync_error_text`

`last_sync_error_*` source priority:

1. blocked readiness state from `local_sync_guard_state`
2. latest non-empty `pending_changes.last_error`
3. `NULL` when no known sync problem exists
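The three-step source priority above can be sketched as a small selector; the function name is illustrative and an empty string stands in for `NULL`:

```go
package main

import "fmt"

// lastSyncError picks the reported last_sync_error_text by the documented
// priority: blocked readiness reason first, then the newest pending-change
// error, then empty when no problem is known.
func lastSyncError(blockedReason, newestPendingError string) string {
	if blockedReason != "" {
		return blockedReason
	}
	if newestPendingError != "" {
		return newestPendingError
	}
	return "" // maps to NULL on the server side
}

func main() {
	fmt.Println(lastSyncError("schema too old", "push failed")) // readiness wins
	fmt.Println(lastSyncError("", "push failed"))               // pending error next
	fmt.Println(lastSyncError("", "") == "")                    // no known problem
}
```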
### Grant Permissions to an Existing User

```sql
GRANT SELECT ON RFQ_LOG.lot TO '<DB_USER>'@'%';
GRANT SELECT ON RFQ_LOG.qt_lot_metadata TO '<DB_USER>'@'%';
GRANT SELECT ON RFQ_LOG.qt_categories TO '<DB_USER>'@'%';
GRANT SELECT ON RFQ_LOG.qt_pricelists TO '<DB_USER>'@'%';
GRANT SELECT ON RFQ_LOG.qt_pricelist_items TO '<DB_USER>'@'%';
GRANT SELECT ON RFQ_LOG.stock_log TO '<DB_USER>'@'%';

GRANT SELECT, INSERT, UPDATE ON RFQ_LOG.qt_configurations TO '<DB_USER>'@'%';
GRANT SELECT, INSERT, UPDATE ON RFQ_LOG.qt_projects TO '<DB_USER>'@'%';

GRANT SELECT, INSERT, UPDATE ON RFQ_LOG.qt_client_schema_state TO '<DB_USER>'@'%';
GRANT SELECT, INSERT, UPDATE ON RFQ_LOG.qt_pricelist_sync_status TO '<DB_USER>'@'%';

GRANT SELECT ON RFQ_LOG.qt_partnumber_books TO '<DB_USER>'@'%';
GRANT SELECT ON RFQ_LOG.qt_partnumber_book_items TO '<DB_USER>'@'%';
GRANT INSERT, UPDATE ON RFQ_LOG.qt_vendor_partnumber_seen TO '<DB_USER>'@'%';

FLUSH PRIVILEGES;
```
|
||||
### Create a New User
|
||||
|
||||
```sql
|
||||
CREATE USER IF NOT EXISTS 'quote_user'@'%' IDENTIFIED BY '<DB_PASSWORD>';
|
||||
|
||||
GRANT SELECT ON RFQ_LOG.lot TO 'quote_user'@'%';
|
||||
GRANT SELECT ON RFQ_LOG.qt_lot_metadata TO 'quote_user'@'%';
|
||||
GRANT SELECT ON RFQ_LOG.qt_categories TO 'quote_user'@'%';
|
||||
GRANT SELECT ON RFQ_LOG.qt_pricelists TO 'quote_user'@'%';
|
||||
GRANT SELECT ON RFQ_LOG.qt_pricelist_items TO 'quote_user'@'%';
|
||||
GRANT SELECT ON RFQ_LOG.stock_log TO 'quote_user'@'%';
|
||||
GRANT SELECT, INSERT, UPDATE ON RFQ_LOG.qt_configurations TO 'quote_user'@'%';
|
||||
GRANT SELECT, INSERT, UPDATE ON RFQ_LOG.qt_projects TO 'quote_user'@'%';
|
||||
GRANT SELECT, INSERT, UPDATE ON RFQ_LOG.qt_client_schema_state TO 'quote_user'@'%';
|
||||
GRANT SELECT, INSERT, UPDATE ON RFQ_LOG.qt_pricelist_sync_status TO 'quote_user'@'%';
|
||||
GRANT SELECT ON RFQ_LOG.qt_partnumber_books TO 'quote_user'@'%';
|
||||
GRANT SELECT ON RFQ_LOG.qt_partnumber_book_items TO 'quote_user'@'%';
|
||||
GRANT INSERT, UPDATE ON RFQ_LOG.qt_vendor_partnumber_seen TO 'quote_user'@'%';
|
||||
|
||||
FLUSH PRIVILEGES;
|
||||
SHOW GRANTS FOR 'quote_user'@'%';
|
||||
```
|
||||
|
||||
**Note:** If pricelists sync but stock enrichment is empty, verify `SELECT` on `qt_pricelist_items`, `qt_partnumber_books`, `qt_partnumber_book_items`, and `stock_log`.
|
||||
|
||||
**Note:** If you see `Access denied for user ...@'<ip>'`, check for conflicting user entries (user@localhost vs user@'%').
|
||||
|
||||
---

Rules:

- the QuoteForge runtime must not depend on any removed legacy BOM tables;
- stock enrichment happens during sync and is persisted into SQLite;
- normal UI requests must not query MariaDB tables directly.
## Migrations

### SQLite Migrations (local)

Two levels, executed on every startup:

- schema creation and additive changes go through GORM `AutoMigrate`;
- data fixes, index repair, and one-off rewrites go through `runLocalMigrations`;
- local migration state is tracked in `local_schema_migrations`.

**1. GORM AutoMigrate** (`internal/localdb/localdb.go`) is the first and primary level.
The list of Go models is passed to `db.AutoMigrate(...)`. GORM creates missing tables and adds new columns; it never drops tables or columns.
To add a new table or column, add the model or field and include the model in AutoMigrate.

**2. `runLocalMigrations`** (`internal/localdb/migrations.go`) is the second level, for operations AutoMigrate cannot perform: data backfills, table rebuilds, index creation.
Each function runs exactly once; idempotency is enforced by recording its `id` in `local_schema_migrations`.

Local SQLite partnumber book cache contract:

- `local_partnumber_books.partnumbers_json` stores PN membership for a pulled book.
- `local_partnumber_book_items` is a deduplicated local catalog by `partnumber`.
- `local_partnumber_book_items.lots_json` mirrors the server `lots_json` payload.
- SQLite migration `2026_03_07_local_partnumber_book_catalog` rebuilds old `book_id + lot_name` rows into the new local cache shape.

QuoteForge does not use centralized server-driven SQLite migrations.
All local SQLite schema/data migrations live in the client codebase.
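The once-only behavior of `runLocalMigrations` can be sketched as follows. Here `migrationLog` stands in for the `local_schema_migrations` table, and the helper name is illustrative:

```go
package main

import "fmt"

// migrationLog simulates the local_schema_migrations table.
type migrationLog map[string]bool

// runOnce executes fn only if id has not been recorded yet, then records it.
// This is the idempotency pattern described above; in the real app the
// record lives in the local_schema_migrations SQLite table.
func runOnce(applied migrationLog, id string, fn func() error) error {
	if applied[id] {
		return nil // already applied on a previous startup
	}
	if err := fn(); err != nil {
		return fmt.Errorf("migration %s: %w", id, err)
	}
	applied[id] = true
	return nil
}

func main() {
	applied := migrationLog{}
	runs := 0
	step := func() error { runs++; return nil }
	_ = runOnce(applied, "2026_03_07_local_partnumber_book_catalog", step)
	_ = runOnce(applied, "2026_03_07_local_partnumber_book_catalog", step)
	fmt.Println(runs) // the second call is a no-op
}
```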
### MariaDB Migrations (server-side)

- SQL files are stored in `migrations/`
- applied via `go run ./cmd/qfs -migrate`
- `min_app_version` marks the minimum app version required for each migration

---
## DB Debugging

```bash
# Inspect schema
sqlite3 ~/.local/state/quoteforge/qfs.db ".schema local_components"
sqlite3 ~/.local/state/quoteforge/qfs.db ".schema local_configurations"

# Check pricelist item count
sqlite3 ~/.local/state/quoteforge/qfs.db "SELECT COUNT(*) FROM local_pricelist_items"

# Check pending sync queue
sqlite3 ~/.local/state/quoteforge/qfs.db "SELECT COUNT(*) FROM pending_changes"
```
@@ -1,170 +1,125 @@
# 04 - API

## Public web routes

| Route | Purpose |
| --- | --- |
| `/` | configurator |
| `/configs` | configuration list |
| `/configs/:uuid/revisions` | revision history page |
| `/projects` | project list |
| `/projects/:uuid` | project detail |
| `/pricelists` | pricelist list |
| `/pricelists/:id` | pricelist detail |
| `/partnumber-books` | partnumber book page (active book summary + snapshot history) |
| `/setup` | DB setup page |
## Setup and health

| Method | Path | Purpose |
| --- | --- | --- |
| `GET` | `/health` | process health |
| `GET` | `/setup` | setup page |
| `POST` | `/setup` | save tested DB settings |
| `POST` | `/setup/test` | test DB connection |
| `GET` | `/setup/status` | setup status |
| `GET` | `/api/db-status` | current DB/sync status |
| `GET` | `/api/current-user` | local user identity |
| `GET` | `/api/ping` | lightweight API ping |
`POST /api/restart` exists only in `debug` mode.
## Reference data

| Method | Path | Purpose |
| --- | --- | --- |
| `GET` | `/api/components` | list component metadata |
| `GET` | `/api/components/:lot_name` | one component by `lot_name` |
| `GET` | `/api/categories` | list categories |
| `GET` | `/api/pricelists` | list local pricelists (`source`, `active_only`, pagination) |
| `GET` | `/api/pricelists/latest` | latest pricelist by source |
| `GET` | `/api/pricelists/:id` | pricelist header |
| `GET` | `/api/pricelists/:id/items` | pricelist rows |
| `GET` | `/api/pricelists/:id/lots` | lot names in a pricelist |
| `GET` | `/api/partnumber-books` | local partnumber book snapshots |
| `GET` | `/api/partnumber-books/:id` | book items by `server_id` (`page`, `per_page`, `search`) |
## Quote and export

| Method | Path | Purpose |
| --- | --- | --- |
| `POST` | `/api/quote/validate` | validate config items |
| `POST` | `/api/quote/calculate` | calculate quote totals (prices from pricelist) |
| `POST` | `/api/quote/price-levels` | resolve estimate/warehouse/competitor prices |
| `POST` | `/api/export/csv` | export a single configuration |
| `GET` | `/api/configs/:uuid/export` | export a stored configuration |
| `GET` | `/api/projects/:uuid/export` | legacy project BOM export |
| `POST` | `/api/projects/:uuid/export` | pricing-tab project export |

`GET /api/pricelists?active_only=true` returns only pricelists that have synced items (`item_count > 0`).
## Configurations

| Method | Path | Purpose |
| --- | --- | --- |
| `GET` | `/api/configs` | list configurations |
| `POST` | `/api/configs/import` | import configurations from server |
| `POST` | `/api/configs` | create configuration |
| `POST` | `/api/configs/preview-article` | preview the generated article |
| `GET` | `/api/configs/:uuid` | get configuration |
| `PUT` | `/api/configs/:uuid` | update configuration |
| `DELETE` | `/api/configs/:uuid` | archive configuration |
| `POST` | `/api/configs/:uuid/reactivate` | reactivate configuration |
| `PATCH` | `/api/configs/:uuid/rename` | rename configuration |
| `POST` | `/api/configs/:uuid/clone` | clone configuration |
| `POST` | `/api/configs/:uuid/refresh-prices` | refresh prices from the pricelist |
| `PATCH` | `/api/configs/:uuid/project` | move configuration to a project |
| `GET` | `/api/configs/:uuid/versions` | list revisions |
| `GET` | `/api/configs/:uuid/versions/:version` | get one revision |
| `POST` | `/api/configs/:uuid/rollback` | roll back by creating a new head revision |
| `PATCH` | `/api/configs/:uuid/server-count` | update server count |
| `GET` | `/api/configs/:uuid/vendor-spec` | read vendor BOM |
| `PUT` | `/api/configs/:uuid/vendor-spec` | replace vendor BOM |
| `POST` | `/api/configs/:uuid/vendor-spec/resolve` | resolve PN → LOT |
| `POST` | `/api/configs/:uuid/vendor-spec/apply` | apply BOM to cart |
The `line` field in configuration payloads is backed by the persistent `line_no` column in the DB.

## Projects
| Method | Path | Purpose |
| --- | --- | --- |
| `GET` | `/api/projects` | paginated project list |
| `GET` | `/api/projects/all` | lightweight list for dropdowns |
| `POST` | `/api/projects` | create project |
| `GET` | `/api/projects/:uuid` | get project |
| `PUT` | `/api/projects/:uuid` | update project |
| `POST` | `/api/projects/:uuid/archive` | archive project |
| `POST` | `/api/projects/:uuid/reactivate` | reactivate project |
| `DELETE` | `/api/projects/:uuid` | delete a project variant only (soft-delete via `is_active=false`; fails when the project has no `variant` set) |
| `GET` | `/api/projects/:uuid/configs` | list project configurations |
| `PATCH` | `/api/projects/:uuid/configs/reorder` | persist line order (`ordered_uuids` → `line_no`) |
| `POST` | `/api/projects/:uuid/configs` | create configuration inside project |
| `POST` | `/api/projects/:uuid/configs/:config_uuid/clone` | clone config into project |
| `POST` | `/api/projects/:uuid/vendor-import` | import a CFXML workspace into the project |
Vendor import contract:

- multipart field name is `file` (a vendor configurator export in `CFXML` format);
- the file size limit is 1 GiB;
- oversized payloads are rejected before XML parsing.
`GET /api/projects/:uuid/configs` ordering: `line ASC`, then `created_at DESC`, then `id DESC`.

## Sync
| Method | Path | Purpose | Flow |
| --- | --- | --- | --- |
| `GET` | `/api/sync/status` | sync status | read-only |
| `GET` | `/api/sync/readiness` | preflight status (ready/blocked/unknown) | read-only |
| `GET` | `/api/sync/info` | sync modal data | read-only |
| `GET` | `/api/sync/users-status` | remote user status | read-only |
| `GET` | `/api/sync/pending/count` | pending queue count | read-only |
| `GET` | `/api/sync/pending` | pending queue rows | read-only |
| `POST` | `/api/sync/components` | pull components | MariaDB → SQLite |
| `POST` | `/api/sync/pricelists` | pull pricelists | MariaDB → SQLite |
| `POST` | `/api/sync/partnumber-books` | pull partnumber books | MariaDB → SQLite |
| `POST` | `/api/sync/partnumber-seen` | report unresolved/ignored vendor PNs | QuoteForge → MariaDB |
| `POST` | `/api/sync/all` | full sync: push and pull | bidirectional |
| `POST` | `/api/sync/push` | push pending changes | SQLite → MariaDB |
| `POST` | `/api/sync/repair` | repair broken `pending_changes` rows | SQLite |

If sync is blocked by the readiness guard, all `POST` sync endpoints return `423 Locked` with `reason_code` and `reason_text`.
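A client calling the sync endpoints can branch on the `423 Locked` contract. A hedged sketch of the client side only (type and function names are illustrative; the field names `reason_code`/`reason_text` come from the contract above):

```go
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
)

// blockedReason models the 423 Locked payload described above.
type blockedReason struct {
	ReasonCode string `json:"reason_code"`
	ReasonText string `json:"reason_text"`
}

// classifySync turns an HTTP status plus response body into a short outcome.
func classifySync(status int, body []byte) string {
	if status == http.StatusLocked { // 423
		var r blockedReason
		if err := json.Unmarshal(body, &r); err == nil && r.ReasonCode != "" {
			return "blocked: " + r.ReasonCode
		}
		return "blocked"
	}
	if status >= 200 && status < 300 {
		return "ok"
	}
	return fmt.Sprintf("error %d", status)
}

func main() {
	fmt.Println(classifySync(423, []byte(`{"reason_code":"schema_outdated","reason_text":"..."}`)))
	fmt.Println(classifySync(200, nil))
}
```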
Vendor spec notes:

- `GET` and `PUT /api/configs/:uuid/vendor-spec` exchange normalized BOM rows (`vendor_spec`), not the raw pasted Excel layout.
- Each BOM row stores its canonical LOT mapping list as shown in the BOM UI: `lot_mappings[]`, where each mapping carries `lot_name` + `quantity_per_pn`.
- `POST /api/configs/:uuid/vendor-spec/resolve` resolves PNs → LOTs without writing.
- `POST /api/configs/:uuid/vendor-spec/apply` rebuilds cart items from the explicit BOM mappings: all LOTs from `lot_mappings[]`.
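The apply step above can be sketched in Go. The `lot_mappings[]` shape follows the contract above; the row fields beyond the documented ones and all names are assumptions for illustration:

```go
package main

import "fmt"

// LotMapping is one entry of vendor_spec[].lot_mappings[].
type LotMapping struct {
	LotName       string `json:"lot_name"`
	QuantityPerPN int    `json:"quantity_per_pn"`
}

// BOMRow is a normalized vendor BOM row (illustrative field set).
type BOMRow struct {
	Partnumber  string       `json:"partnumber"`
	Qty         int          `json:"qty"`
	LotMappings []LotMapping `json:"lot_mappings"`
}

// applyToCart rebuilds cart quantities from explicit BOM mappings:
// every LOT contributes row qty * quantity_per_pn.
func applyToCart(rows []BOMRow) map[string]int {
	cart := map[string]int{}
	for _, r := range rows {
		for _, m := range r.LotMappings {
			cart[m.LotName] += r.Qty * m.QuantityPerPN
		}
	}
	return cart
}

func main() {
	rows := []BOMRow{{
		Partnumber: "P12345", // hypothetical PN
		Qty:        2,
		LotMappings: []LotMapping{
			{LotName: "CPU_X", QuantityPerPN: 1},
			{LotName: "HS_X", QuantityPerPN: 2},
		},
	}}
	fmt.Println(applyToCart(rows)["HS_X"]) // 2 rows' worth: 2 * 2
}
```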
See [09-vendor-spec.md](09-vendor-spec.md) for the `vendor_spec` JSON schema, the BOM UI mapping contract, and partnumber book pull logic.
**Export filename format:** `YYYY-MM-DD (ProjectCode) ConfigName Article.csv` (uses `project.Code`, not `project.Name`).

`POST /api/projects/:uuid/export` accepts selectable column flags: `include_lot`, `include_bom`, `include_estimate`, `include_stock`, `include_competitor`.

---
## Rollback API (details)

```bash
POST /api/configs/:uuid/rollback
Content-Type: application/json

{
  "target_version": 3,
  "note": "optional comment"
}
```

Response: the updated configuration with the new head version.
@@ -1,141 +1,74 @@
# 05 - Config

## Runtime files

| Artifact | Default location |
| --- | --- |
| `qfs.db` | OS-specific user state directory |
| `config.yaml` | same state directory as `qfs.db` |
| `local_encryption.key` | same state directory as `qfs.db` |
| `backups/` | next to `qfs.db` unless overridden |

Default `qfs.db` locations by OS:

| OS | Default path |
|----|-------------|
| macOS | `~/Library/Application Support/QuoteForge/qfs.db` |
| Linux | `$XDG_STATE_HOME/quoteforge/qfs.db` or `~/.local/state/quoteforge/qfs.db` |
| Windows | `%LOCALAPPDATA%\QuoteForge\qfs.db` |

The runtime state directory can be overridden with `QFS_STATE_DIR`.
Direct paths can be overridden with `QFS_DB_PATH` and `QFS_CONFIG_PATH`.
## Runtime config shape

### config.yaml

By default, `config.yaml` is searched in the same user-state directory as `qfs.db`.
If the file does not exist, it is created automatically.
If the format is outdated, it is automatically migrated to the minimal runtime format on startup.

Override: `-config <path>` or `QFS_CONFIG_PATH`.
The SQLite path can likewise be overridden with `-localdb <path>` or `QFS_DB_PATH`.

**Important:** `config.yaml` is a runtime user file; it is **not stored in the repository**.
`config.example.yaml` is the only config template in the repo.
### Local encryption key

Saved MariaDB credentials in SQLite are encrypted with:

1. `QUOTEFORGE_ENCRYPTION_KEY`, if explicitly provided; otherwise
2. an application-managed random key file stored at `<state dir>/local_encryption.key`.

Rules:

- The key file is created automatically with mode `0600`.
- The key file is not committed and is not included in normal backups.
- Restoring `qfs.db` on another machine requires re-entering DB credentials unless the key file is migrated separately.
---

Runtime keeps `config.yaml` intentionally small:

```yaml
server:
  host: "127.0.0.1"
  port: 8080
  mode: "release"    # release | debug
  read_timeout: 30s
  write_timeout: 30s

backup:
  time: "00:00"      # HH:MM in local time

logging:
  level: "info"      # debug | info | warn | error
  format: "json"     # json | text
  output: "stdout"   # stdout | stderr | /path/to/file
```
---

Rules:

- QuoteForge creates this file automatically if it does not exist;
- startup rewrites legacy config files into this minimal runtime shape;
- startup normalizes any `server.host` value to `127.0.0.1` before saving the runtime config;
- `server.host` must stay on loopback.

Saved MariaDB credentials do not live in `config.yaml`.
They are stored in SQLite and encrypted with `local_encryption.key`, unless `QUOTEFORGE_ENCRYPTION_KEY` overrides the key material.
## Environment variables

| Variable | Purpose | Default |
| --- | --- | --- |
| `QFS_STATE_DIR` | override the runtime state directory | OS-specific user state dir |
| `QFS_DB_PATH` | explicit SQLite path | `<state dir>/qfs.db` |
| `QFS_CONFIG_PATH` | explicit config path | `<state dir>/config.yaml` |
| `QFS_BACKUP_DIR` | explicit backup root | `<db dir>/backups` |
| `QFS_BACKUP_DISABLE` | disable rotating backups | unset |
| `QUOTEFORGE_ENCRYPTION_KEY` | override the credential encryption key | app-managed key file |
| `QF_SERVER_PORT` | override the HTTP port | 8080 |

`QFS_BACKUP_DISABLE` accepts `1`, `true`, or `yes`.
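The accepted truthy values can be checked with a small predicate; the function name is illustrative:

```go
package main

import (
	"fmt"
	"os"
)

// backupDisabled reports whether the QFS_BACKUP_DISABLE value is one of the
// accepted truthy spellings: "1", "true", "yes".
func backupDisabled(v string) bool {
	switch v {
	case "1", "true", "yes":
		return true
	}
	return false
}

func main() {
	os.Setenv("QFS_BACKUP_DISABLE", "yes")
	fmt.Println(backupDisabled(os.Getenv("QFS_BACKUP_DISABLE"))) // true
	fmt.Println(backupDisabled("0"))                             // false
}
```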
---

## CLI flags

| Flag | Purpose |
| --- | --- |
| `-config <path>` | config file path |
| `-localdb <path>` | SQLite path |
| `-reset-localdb` | destructive local DB reset |
| `-migrate` | apply server migrations and exit |
| `-version` | print the app version and exit |

---
## First run

### Requirements

- Go 1.22 or higher
- MariaDB 11.x (or MySQL 8.x)
- ~50 MB disk space

### Steps

```bash
# 1. Clone the repository
git clone <repo-url>
cd quoteforge

# 2. Apply migrations
go run ./cmd/qfs -migrate

# 3. Start
go run ./cmd/qfs
# or
make run
```

The application is available at http://localhost:8080.

On first run, `/setup` opens for configuring the MariaDB connection.
### OPS Project Migrator

Migrates quotes whose names start with `OPS-xxxx` (where `x` is a digit) into a project named `OPS-xxxx`.

```bash
# Preview first (always)
go run ./cmd/migrate_ops_projects

# Apply
go run ./cmd/migrate_ops_projects -apply

# Apply without interactive confirmation
go run ./cmd/migrate_ops_projects -apply -yes
```

---
## Docker

```bash
docker build -t quoteforge .
docker-compose up -d
```

Startup sequence:

1. the runtime ensures `config.yaml` exists;
2. the runtime opens the local SQLite database;
3. if no stored MariaDB credentials exist, `/setup` is served;
4. after setup, the runtime works locally and background sync uses the saved DB settings.
@@ -1,227 +1,55 @@
# 06 - Backup

## Scope

QuoteForge creates rotating local ZIP backups of:

- a consistent SQLite snapshot saved as `qfs.db`;
- `config.yaml` when present.

The backup intentionally does not include `local_encryption.key`.
## Location and naming

Default root:

- `<db dir>/backups`

Subdirectories:

- `daily/`
- `weekly/`
- `monthly/`
- `yearly/`

Archive name:

- `qfs-backp-YYYY-MM-DD.zip`
## Retention

| Period | Keep |
|--------|------|
| Daily | 7 archives |
| Weekly | 4 archives |
| Monthly | 12 archives |
| Yearly | 10 archives |

Backup timing is configured in `config.yaml`:

```yaml
backup:
  time: "00:00"   # trigger time in local time (HH:MM)
```

Environment overrides: `QFS_BACKUP_DIR` (backup root, default `<db dir>/backups`) and `QFS_BACKUP_DISABLE` (`1`/`true`/`yes`).
## Behavior

- on startup, QuoteForge creates a backup if the current period has none yet;
- a daily scheduler creates the next backup at `backup.time`;
- duplicate snapshots inside the same period are prevented by a `.period.json` marker file in each period directory;
- archives beyond the retention limits are pruned automatically.
||||
---
|
||||
## Safety rules
|
||||
|
||||
## Implementation
|
||||
- backup root must be outside the git worktree;
|
||||
- backup creation is blocked if the resolved backup root sits inside the repository;
|
||||
- SQLite snapshot must be created from a consistent database copy, not by copying live WAL files directly;
|
||||
- restore to another machine requires re-entering DB credentials unless the encryption key is migrated separately.
|
||||
|
||||
Module: `internal/appstate/backup.go`
|
||||
### Entry points

Main function:

```go
func EnsureRotatingLocalBackup(dbPath, configPath string) ([]string, error)
```

Scheduler (in `main.go`):

```go
func startBackupScheduler(ctx context.Context, cfg *config.Config, dbPath, configPath string)
```

### Config struct

```go
type BackupConfig struct {
	Time string `yaml:"time"` // default: "00:00"
}
```
---

## Implementation Notes

- `backup.time` is interpreted in **local time**; no timezone offset is parsed
- `.period.json` is the marker that prevents duplicate backups within the same period
- archive filenames contain only the date; uniqueness comes from the per-period directories plus the period marker
- when changing naming or retention, update both the filename logic and the prune logic
- git worktree detection is path-based (`.git` ancestor check) and blocks backup creation inside the repo tree
---
## Full Listing: `internal/appstate/backup.go`

```go
package appstate

import (
	"archive/zip"
	"encoding/json"
	"fmt"
	"io"
	"os"
	"path/filepath"
	"sort"
	"strings"
	"time"
)

type backupPeriod struct {
	name      string
	retention int
	key       func(time.Time) string
	date      func(time.Time) string
}

var backupPeriods = []backupPeriod{
	{
		name:      "daily",
		retention: 7,
		key:       func(t time.Time) string { return t.Format("2006-01-02") },
		date:      func(t time.Time) string { return t.Format("2006-01-02") },
	},
	{
		name:      "weekly",
		retention: 4,
		key: func(t time.Time) string {
			y, w := t.ISOWeek()
			return fmt.Sprintf("%04d-W%02d", y, w)
		},
		date: func(t time.Time) string { return t.Format("2006-01-02") },
	},
	{
		name:      "monthly",
		retention: 12,
		key:       func(t time.Time) string { return t.Format("2006-01") },
		date:      func(t time.Time) string { return t.Format("2006-01-02") },
	},
	{
		name:      "yearly",
		retention: 10,
		key:       func(t time.Time) string { return t.Format("2006") },
		date:      func(t time.Time) string { return t.Format("2006-01-02") },
	},
}

func EnsureRotatingLocalBackup(dbPath, configPath string) ([]string, error) {
	if isBackupDisabled() || dbPath == "" {
		return nil, nil
	}
	if _, err := os.Stat(dbPath); os.IsNotExist(err) {
		return nil, nil
	}
	root := resolveBackupRoot(dbPath)
	now := time.Now()
	created := make([]string, 0)
	for _, period := range backupPeriods {
		newFiles, err := ensurePeriodBackup(root, period, now, dbPath, configPath)
		if err != nil {
			return created, err
		}
		created = append(created, newFiles...)
	}
	return created, nil
}
```
---

## Full Listing: Scheduler Hook (`main.go`)

```go
func startBackupScheduler(ctx context.Context, cfg *config.Config, dbPath, configPath string) {
	if cfg == nil {
		return
	}
	hour, minute, err := parseBackupTime(cfg.Backup.Time)
	if err != nil {
		slog.Warn("invalid backup time; using 00:00", "value", cfg.Backup.Time, "error", err)
		hour, minute = 0, 0
	}

	// Startup check: create backups immediately if none exist for the current periods.
	if created, backupErr := appstate.EnsureRotatingLocalBackup(dbPath, configPath); backupErr != nil {
		slog.Error("local backup failed", "error", backupErr)
	} else {
		for _, path := range created {
			slog.Info("local backup completed", "archive", path)
		}
	}

	for {
		next := nextBackupTime(time.Now(), hour, minute)
		timer := time.NewTimer(time.Until(next))
		select {
		case <-ctx.Done():
			timer.Stop()
			return
		case <-timer.C:
			start := time.Now()
			created, backupErr := appstate.EnsureRotatingLocalBackup(dbPath, configPath)
			duration := time.Since(start)
			if backupErr != nil {
				slog.Error("local backup failed", "error", backupErr, "duration", duration)
			} else {
				for _, path := range created {
					slog.Info("local backup completed", "archive", path, "duration", duration)
				}
			}
		}
	}
}

func parseBackupTime(value string) (int, int, error) {
	if strings.TrimSpace(value) == "" {
		return 0, 0, fmt.Errorf("empty backup time")
	}
	parsed, err := time.Parse("15:04", value)
	if err != nil {
		return 0, 0, err
	}
	return parsed.Hour(), parsed.Minute(), nil
}

func nextBackupTime(now time.Time, hour, minute int) time.Time {
	target := time.Date(now.Year(), now.Month(), now.Day(), hour, minute, 0, 0, now.Location())
	if !now.Before(target) {
		target = target.Add(24 * time.Hour)
	}
	return target
}
```
## Restore

1. stop QuoteForge;
2. unpack the chosen archive outside the repository;
3. replace `qfs.db`;
4. replace `config.yaml` if needed;
5. restart the app;
6. re-enter MariaDB credentials if the original encryption key is unavailable.
---

# 07 - Development

## Common commands
```bash
# Run (dev)
go run ./cmd/qfs
make run

# Build
make build-release                      # optimized build with version info
CGO_ENABLED=0 go build -o bin/qfs ./cmd/qfs

# Cross-platform builds
make build-all                          # Linux, macOS, Windows
make build-windows                      # Windows only

# Verification
go build ./cmd/qfs                      # must compile without errors
go vet ./...                            # linter

# Tests and migrations
go run ./cmd/qfs -migrate
go run ./cmd/migrate_project_updated_at
go test ./...
make test

# Utilities
make install-hooks                      # git hooks (block committing secrets)
make clean                              # clean bin/
make help                               # all available commands
```
---

## Code Style

- **Formatting:** `gofmt` (mandatory)
- **Logging:** `slog` only (structured logging to the binary's stdout/stderr). No `console.log` or any other logging in browser-side JS; the browser console is never used for diagnostics.
- **Errors:** explicit wrapping with context (`fmt.Errorf("context: %w", err)`)
- **Style:** no unnecessary abstractions; minimum code for the task
---
## Guardrails

- run `gofmt` before commit;
- use `slog` for server logging;
- keep runtime business logic SQLite-only;
- limit MariaDB access to sync, setup, and migration tooling;
- keep `config.yaml` out of git and use `config.example.yaml` only as a template;
- update `bible-local/` in the same commit as architecture changes.

### Removed features that must not return

The following components were **intentionally removed** and must not be brought back:

- admin pricing UI/API;
- alerts and notification workflows;
- stock import tooling;
- cron jobs;
- standalone importer utility.

### Configuration Files

- `config.yaml` — runtime user file, **not stored in the repository**
- `config.example.yaml` — the only config template in the repo
### Sync and Local-First

- Any sync changes must preserve local-first behavior
- Local CRUD must not be blocked when MariaDB is unavailable
- Runtime business code must not query MariaDB directly; all normal reads/writes go through SQLite snapshots
- Direct MariaDB access is allowed only in `internal/services/sync/*` and dedicated setup/migration tools under `cmd/`
- `connMgr.GetDB()` in handlers/services outside sync is a code-review failure unless the code is strictly setup or operator tooling
- Local SQLite migrations must be implemented in code; do not add a server-side registry of client SQLite SQL patches
- Read-only local cache tables may be reset during startup recovery if a migration fails; do not apply that strategy to user-authored tables such as configurations, projects, pending changes, or connection settings
### Formats and UI

- **CSV export:** the filename must use the **project code** (`project.Code`), not the project name.
  Format: `YYYY-MM-DD (ProjectCode) ConfigName Article.csv`
- **Breadcrumbs UI:** names longer than 16 characters must be truncated with an ellipsis

### Architecture Documentation

- **Every architectural decision must be recorded in `bible/`**
- The corresponding Bible file must be updated **in the same commit** as the code change
- On every user-requested commit, review and update the Bible in that same commit

---
## Common Tasks

### Add a Field to Configuration

1. Add the field to the `LocalConfiguration` struct (`internal/models/`)
2. Add GORM tags for the DB column
3. Write a SQL migration (`migrations/`)
4. Update the `ConfigurationToLocal` / `LocalToConfiguration` converters
5. Update API handlers and services

### Add a Field to Component

1. Add the field to the `LocalComponent` struct (`internal/models/`)
2. Update the SQL query in `SyncComponents()`
3. Update the `componentRow` struct to match
4. Update converter functions

### Add a Pricelist Price Lookup

```go
// Modern pattern
price, found := s.lookupPriceByPricelistID(pricelistID, lotName)
if found && price > 0 {
	// use price
}
```
---

## Known Gotchas

1. **`CurrentPrice` removed from components** — any code still using it will fail to compile
2. **`HasPrice` filter removed** — `ListComponents` in `component.go` no longer supports this filter
3. **Quote calculation** is always SQLite-only; do not add a live MariaDB fallback
4. **Items JSON:** prices are stored in the configuration's `items` field, not fetched from components
5. **Migrations are additive:** already-applied migrations are skipped (checked by `id` in `local_schema_migrations`)
6. **`SyncedAt` removed:** the last component sync time now lives in `app_settings` (key = `last_component_sync`)

---
## Debugging Price Issues

**Problem: quote returns no prices**

1. Check that `pricelist_id` is set on the configuration
2. Check that pricelist items exist: `SELECT COUNT(*) FROM local_pricelist_items`
3. Check `lookupPriceByPricelistID()` in `quote.go`
4. Verify the correct source is used (estimate/warehouse/competitor)

**Problem: component sync not working**

1. Components sync as metadata only — no prices
2. Prices come via a separate pricelist sync
3. Check `SyncComponents()` and the MariaDB query

**Problem: configuration refresh does not update prices**

1. Refresh uses the latest estimate pricelist by default
2. Latest-pricelist resolution ignores pricelists without items (`EXISTS local_pricelist_items`)
3. Old prices in `config.items` are preserved if a line item is not found in the pricelist
4. To force a specific pricelist: set `configuration.pricelist_id`
5. In the configurator, `Авто` (auto) must remain auto mode: the runtime-resolved ID must not be persisted as an explicit selection
## Release notes

Release history belongs under `releases/<version>/RELEASE_NOTES.md`.
Do not keep temporary change summaries in the repository root.

---

# 09 - Vendor BOM
## Overview

The vendor spec feature imports a vendor BOM (Bill of Materials) into a configuration. It maps vendor part numbers (PN) to internal LOT names using an active partnumber book (a snapshot pulled from PriceForge), then aggregates quantities to populate the Estimate (cart).

---

## Architecture

### Storage

| Data | Storage | Sync direction |
|------|---------|---------------|
| `vendor_spec` JSON | `local_configurations.vendor_spec` (TEXT, JSON-encoded) | Two-way via `pending_changes` |
| Partnumber book snapshots | `local_partnumber_books` + `local_partnumber_book_items` | Pull-only from PriceForge |

`vendor_spec` is a JSON array of `VendorSpecItem` objects stored inside the configuration row.
It syncs to MariaDB `qt_configurations.vendor_spec` via the existing `pending_changes` mechanism.

Legacy storage note:

- QuoteForge does not use `qt_bom`
- QuoteForge does not use `qt_lot_bundles`
- QuoteForge does not use `qt_lot_bundle_items`

The only canonical persisted BOM contract for QuoteForge is `qt_configurations.vendor_spec`.
### `vendor_spec` JSON Schema

```json
[
  {
    "sort_order": 10,
    "vendor_partnumber": "ABC-123",
    "quantity": 2,
    "description": "...",
    "unit_price": 4500.00,
    "total_price": 9000.00,
    "lot_mappings": [
      { "lot_name": "LOT_A", "quantity_per_pn": 1 },
      { "lot_name": "LOT_B", "quantity_per_pn": 2 }
    ]
  }
]
```

`lot_mappings[]` is the canonical persisted LOT mapping list for a BOM row.
Each mapping entry stores:

- `lot_name`
- `quantity_per_pn` (how many units of this LOT are included in **one vendor PN**)
### PN → LOT Mapping Contract (single LOT, multiplier, bundle)

QuoteForge expects the server to return/store BOM rows (`vendor_spec`) using a single canonical mapping list:

- `lot_mappings[]` contains **all** LOT mappings for the PN row (single-LOT and bundle cases alike)
- the list stores exactly what the user sees in BOM (LOT + "LOT в 1 PN")
- the DB contract does **not** split mappings into "base LOT" vs "bundle LOTs"
- the apply flow rebuilds cart rows from `lot_mappings[]`

#### Final quantity contribution to Estimate

For one BOM row with vendor PN quantity `pn_qty`, each mapping contributes:

- `lot_qty = pn_qty * lot_mappings[i].quantity_per_pn`

#### Example: one PN maps to multiple LOTs

```json
{
  "vendor_partnumber": "SYS-821GE-TNHR",
  "quantity": 3,
  "lot_mappings": [
    { "lot_name": "CHASSIS_X13_8GPU", "quantity_per_pn": 1 },
    { "lot_name": "PS_3000W_Titanium", "quantity_per_pn": 2 },
    { "lot_name": "RAILKIT_X13", "quantity_per_pn": 1 }
  ]
}
```

This row contributes to Estimate:

- `CHASSIS_X13_8GPU` → `3 * 1 = 3`
- `PS_3000W_Titanium` → `3 * 2 = 6`
- `RAILKIT_X13` → `3 * 1 = 3`
---

## Partnumber Books (Snapshots)

Partnumber books are immutable, versioned snapshots of the global PN→LOT mapping table, analogous to pricelists. PriceForge creates new snapshots; QuoteForge only pulls and reads them.

Local tables:

- `local_partnumber_books`
- `local_partnumber_book_items`

Server tables:

- `qt_partnumber_books`
- `qt_partnumber_book_items`

### SQLite (local mirror)
```sql
CREATE TABLE local_partnumber_books (
  id INTEGER PRIMARY KEY AUTOINCREMENT,
  server_id INTEGER UNIQUE NOT NULL,   -- id from qt_partnumber_books
  version TEXT NOT NULL,               -- format YYYY-MM-DD-NNN
  created_at DATETIME NOT NULL,
  is_active INTEGER NOT NULL DEFAULT 1
);

CREATE TABLE local_partnumber_book_items (
  id INTEGER PRIMARY KEY AUTOINCREMENT,
  partnumber TEXT NOT NULL,
  lots_json TEXT NOT NULL,
  description TEXT
);
CREATE UNIQUE INDEX idx_local_book_pn ON local_partnumber_book_items(partnumber);
```

**Active book query:** `WHERE is_active=1 ORDER BY created_at DESC, id DESC LIMIT 1`

**Schema creation:** GORM AutoMigrate (not `runLocalMigrations`).
### MariaDB (managed exclusively by PriceForge)

```sql
CREATE TABLE qt_partnumber_books (
  id INT AUTO_INCREMENT PRIMARY KEY,
  version VARCHAR(50) NOT NULL,
  created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
  is_active TINYINT(1) NOT NULL DEFAULT 1,
  partnumbers_json LONGTEXT NOT NULL
);

CREATE TABLE qt_partnumber_book_items (
  id INT AUTO_INCREMENT PRIMARY KEY,
  partnumber VARCHAR(255) NOT NULL,
  lots_json LONGTEXT NOT NULL,
  description VARCHAR(10000) NULL,
  UNIQUE KEY uq_qt_partnumber_book_items_partnumber (partnumber)
);

ALTER TABLE qt_configurations ADD COLUMN vendor_spec JSON NULL;
```

QuoteForge has `SELECT` permission only on `qt_partnumber_books` and `qt_partnumber_book_items`. All writes are managed by PriceForge.

**Grant (add to existing user setup):**

```sql
GRANT SELECT ON RFQ_LOG.qt_partnumber_books TO '<DB_USER>'@'%';
GRANT SELECT ON RFQ_LOG.qt_partnumber_book_items TO '<DB_USER>'@'%';
```
---

## Resolution Algorithm (3-step)

For each `vendor_partnumber` in the BOM, QuoteForge builds/updates the UI-visible LOT mappings:

1. **Active book lookup** — read the active `local_partnumber_books` row, verify PN membership in `partnumbers_json`, then query `local_partnumber_book_items WHERE partnumber = ?`.
2. **Populate BOM UI** — if a match exists, the BOM row gets `lot_mappings[]` from `lots_json` (the user can still edit it).
3. **Unresolved** — red row + inline LOT input with strict autocomplete.

Persistence note: the application stores the final user-visible mappings in `lot_mappings[]` (not separate "resolved/manual" persisted fields).
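Steps 1–2 reduce to a keyed lookup once the active book is in memory. A minimal sketch under that assumption (`bookIndex` and the types are illustrative; production code queries SQLite directly):

```go
package main

import "fmt"

// LotMapping mirrors one entry of lot_mappings[].
type LotMapping struct {
	LotName       string
	QuantityPerPN int
}

// resolvePN returns the mappings for a vendor PN from an in-memory index
// of the active partnumber book. A miss means the BOM row stays
// unresolved (red) and waits for manual LOT input.
func resolvePN(bookIndex map[string][]LotMapping, pn string) ([]LotMapping, bool) {
	m, ok := bookIndex[pn]
	return m, ok
}

func main() {
	book := map[string][]LotMapping{
		"ABC-123": {{LotName: "LOT_A", QuantityPerPN: 1}, {LotName: "LOT_B", QuantityPerPN: 2}},
	}
	if m, ok := resolvePN(book, "ABC-123"); ok {
		fmt.Println(len(m)) // 2
	}
	_, ok := resolvePN(book, "UNKNOWN-PN")
	fmt.Println(ok) // false: row stays unresolved in the UI
}
```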
---

## CFXML Workspace Import Contract

QuoteForge may import a vendor configurator workspace in `CFXML` format as an update path for an existing project.
This import path converts one external workspace into one QuoteForge project containing multiple configurations.
### Import Unit Boundaries

- One `CFXML` workspace file = one QuoteForge project import session.
- One top-level configuration group inside the workspace = one QuoteForge configuration.
- Software rows are **not** imported as standalone configurations.
- All software rows must be attached to the configuration group they belong to.
### Configuration Grouping

Top-level `ProductLineItem` rows are grouped by:

- `ProprietaryGroupIdentifier`

This field is the canonical boundary of one imported configuration.
`POST /api/projects/:uuid/vendor-import` imports one vendor workspace into an existing project.

Rules:

1. Read all top-level `ProductLineItem` rows in document order.
2. Group them by `ProprietaryGroupIdentifier`.
3. Preserve the document order of groups by the first encountered `ProductLineNumber`.
4. Import each group as exactly one QuoteForge configuration.

`ConfigurationGroupLineNumberReference` is not sufficient for grouping imported configurations because multiple independent configuration groups may share the same value in one workspace.
### Primary Row Selection (no SKU hardcode)

The importer must not hardcode vendor, model, or server SKU values.

Within each `ProprietaryGroupIdentifier` group, the importer selects one primary top-level row using structural rules only:

1. Prefer rows with `ProductTypeCode = Hardware`.
2. If multiple rows match, prefer the row with the largest number of `ProductSubLineItem` children.
3. If there is still a tie, prefer the first row by `ProductLineNumber`.

The primary row provides configuration-level metadata such as:

- configuration name
- server count
- server model / description
- article / support code candidate
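The three structural rules above can be sketched as a comparison over candidate rows. The `row` type is a simplified stand-in (not the `ImportedTopLevelRow` DTO defined later), and the tie-break compares line numbers as strings for brevity; production code should tie-break on document order:

```go
package main

import "fmt"

// row is a simplified stand-in for a top-level ProductLineItem.
type row struct {
	ProductLineNumber string
	ProductType       string
	SubRowCount       int // number of ProductSubLineItem children
}

// better reports whether a should be preferred over b:
// 1) Hardware wins, 2) more sub-rows wins, 3) earlier line number wins.
func better(a, b row) bool {
	ah, bh := a.ProductType == "Hardware", b.ProductType == "Hardware"
	if ah != bh {
		return ah
	}
	if a.SubRowCount != b.SubRowCount {
		return a.SubRowCount > b.SubRowCount
	}
	return a.ProductLineNumber < b.ProductLineNumber
}

// pickPrimary selects the primary row of one group.
func pickPrimary(rows []row) row {
	best := rows[0]
	for _, r := range rows[1:] {
		if better(r, best) {
			best = r
		}
	}
	return best
}

func main() {
	rows := []row{
		{"10", "Software", 9}, // software never beats hardware
		{"20", "Hardware", 1},
		{"30", "Hardware", 3}, // most sub-rows among hardware rows
	}
	fmt.Println(pickPrimary(rows).ProductLineNumber) // 30
}
```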
### Software Inclusion Rule

All top-level rows belonging to the same `ProprietaryGroupIdentifier` must be imported into the same QuoteForge configuration, including:

- `Hardware`
- `Software`
- instruction / service rows represented as software-like items

Effects:

- a workspace never creates a separate configuration made only of software;
- `software1`, `software2`, license rows, and instruction rows stay inside the related configuration;
- the user sees one complete configuration instead of fragmented partial imports.
### Mapping to QuoteForge Project / Configuration

For one imported configuration group:

- QuoteForge configuration `name` <- primary row `ProductName`
- QuoteForge configuration `server_count` <- primary row `Quantity`
- QuoteForge configuration `server_model` <- primary row `ProductDescription`
- QuoteForge configuration `article` or `support_code` <- primary row `ProprietaryProductIdentifier`
- QuoteForge configuration `line` <- stable order by group appearance in the workspace

Project-level fields such as QuoteForge `code`, `name`, and `variant` are not reliably defined by `CFXML` itself and should come from the existing target project context or explicit user input.
### Mapping to `vendor_spec`

The importer must build one combined `vendor_spec` array per configuration group.

Source rows:

- all `ProductSubLineItem` rows from the primary top-level row;
- all `ProductSubLineItem` rows from every non-primary top-level row in the same group;
- if a top-level row has no `ProductSubLineItem`, the top-level row itself may be converted into one `vendor_spec` row so that software-only content is not lost.

Each imported row maps into one `VendorSpecItem`:

- `sort_order` <- stable sequence within the group
- `vendor_partnumber` <- `ProprietaryProductIdentifier`
- `quantity` <- `Quantity`
- `description` <- `ProductDescription`
- `unit_price` <- `UnitListPrice.FinancialAmount.MonetaryAmount` when present
- `total_price` <- `quantity * unit_price` when the unit price is present
- `lot_mappings` <- resolved immediately from the active partnumber book using `lots_json`

The importer stores vendor-native rows in `vendor_spec`, then immediately runs the same logical flow as BOM Resolve + Apply:

- resolve vendor PN rows through the active partnumber book
- persist canonical `lot_mappings[]`
- build normalized configuration `items` from `row.quantity * quantity_per_pn`
- fill `items.unit_price` from the latest local `estimate` pricelist
- recalculate the configuration `total_price`
### Import Pipeline

Recommended parser pipeline:

1. Parse XML into top-level `ProductLineItem` rows.
2. Group rows by `ProprietaryGroupIdentifier`.
3. Select one primary row per group using structural rules.
4. Build one QuoteForge configuration DTO per group.
5. Merge all hardware/software rows of the group into one `vendor_spec`.
6. Resolve imported PN rows into canonical `lot_mappings[]` using the active partnumber book.
7. Build configuration `items` from the resolved `lot_mappings[]`.
8. Price those `items` from the latest local `estimate` pricelist.
9. Save or update the QuoteForge configuration inside the existing project.
### Recommended Internal DTO

```go
type ImportedProject struct {
	SourceFormat   string
	SourceFilePath string
	SourceDocID    string

	Code    string
	Name    string
	Variant string

	Configurations []ImportedConfiguration
}

type ImportedConfiguration struct {
	GroupID string

	Name        string
	Line        int
	ServerCount int

	ServerModel  string
	Article      string
	SupportCode  string
	CurrencyCode string

	TopLevelRows []ImportedTopLevelRow
	VendorSpec   []ImportedVendorRow
}

type ImportedTopLevelRow struct {
	ProductLineNumber string
	ItemNo            string
	GroupID           string

	ProductType string
	ProductCode string
	ProductName string
	Description string
	Quantity    int
	UnitPrice   *float64
	IsPrimary   bool

	SubRows []ImportedVendorRow
}

type ImportedVendorRow struct {
	SortOrder int

	SourceLineNumber  string
	SourceParentLine  string
	SourceProductType string

	VendorPartnumber string
	Description      string
	Quantity         int
	UnitPrice        *float64
	TotalPrice       *float64

	ProductCharacter string
	ProductCharPath  string
}
```
### Current Product Assumption

For QuoteForge product behavior, the correct user-facing interpretation is:

- one external project/workspace contains several configurations;
- each configuration contains both hardware and software rows that belong to it;
- the importer must preserve that grouping exactly.
---

## Qty Aggregation Logic

After resolution, the quantity per LOT is computed from the BOM row quantity multiplied by the matched `lots_json.qty`:

```
qty(lot) = SUM(quantity_of_pn_row * quantity_of_lot_inside_lots_json)
```

Examples (book: PN_X → `[{LOT_A, qty:2}, {LOT_B, qty:1}]`):

- BOM: PN_X ×3 → `LOT_A ×6`, `LOT_B ×3`
- BOM: PN_X ×1 and PN_X ×2 → `LOT_A ×6`, `LOT_B ×3`
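The aggregation rule above can be sketched over resolved BOM rows (types simplified for illustration; the sum matches the `qty(lot)` formula and reproduces the second example):

```go
package main

import "fmt"

type lotMapping struct {
	LotName       string
	QuantityPerPN int
}

type bomRow struct {
	Quantity    int
	LotMappings []lotMapping
}

// aggregateLots sums quantity_of_pn_row * quantity_per_pn per LOT
// across all BOM rows, matching qty(lot) = SUM(...) above.
func aggregateLots(rows []bomRow) map[string]int {
	totals := make(map[string]int)
	for _, r := range rows {
		for _, m := range r.LotMappings {
			totals[m.LotName] += r.Quantity * m.QuantityPerPN
		}
	}
	return totals
}

func main() {
	// Book: PN_X -> [{LOT_A, qty:2}, {LOT_B, qty:1}]; BOM: PN_X x1 and PN_X x2.
	pnX := []lotMapping{{"LOT_A", 2}, {"LOT_B", 1}}
	rows := []bomRow{{Quantity: 1, LotMappings: pnX}, {Quantity: 2, LotMappings: pnX}}
	fmt.Println(aggregateLots(rows)) // map[LOT_A:6 LOT_B:3]
}
```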
---

## UI: Three Top-Level Tabs

The configurator (`/configurator`) has three tabs:

1. **Estimate** — the existing cart/component configurator (unchanged).
2. **BOM** — paste/import a vendor BOM, manual column mapping, LOT matching, bundle decomposition (`1 PN -> multiple LOTs`), "Пересчитать эстимейт" (recalculate estimate), "Очистить" (clear).
3. **Ценообразование** (pricing) — pricing summary table + custom price input.

BOM data is shared between tabs 2 and 3.
### BOM Import UI (raw table, manual column mapping)

After paste (`Ctrl+V`), QuoteForge renders an editable raw table (no auto-detected parsing).

- The pasted rows are shown **as-is** (including header rows, if present).
- The user selects a type for each column manually:
  - `P/N`
  - `Кол-во` (quantity)
  - `Цена` (price)
  - `Описание` (description)
  - `Не использовать` (do not use)
- Required mapping:
  - exactly one `P/N`
  - exactly one `Кол-во`
- Optional mapping:
  - `Цена` (0..1)
  - `Описание` (0..1)
- Rows can be:
  - ignored (UI-only, excluded from `vendor_spec`)
  - deleted
- Raw cells are editable inline after paste.

Notes:

- There is **no auto column detection**.
- There is **no auto header-row skip**.
- The raw import layout itself is not stored on the server; only the normalized `vendor_spec` is stored.
### LOT matching in BOM table

The BOM table adds service columns on the right:

- `LOT`
- `LOT в 1 PN` (LOT per 1 PN)
- actions (`+`, ignore, delete)

`LOT` behavior:

- The first LOT row shown in the BOM UI is the primary LOT mapping for the PN row.
- Additional LOT rows are added via the `+` action.
- The inline LOT input is strict:
  - autocomplete source = full local components list (`/api/components?per_page=5000`)
  - free text that does not match an existing LOT is rejected

`LOT в 1 PN` behavior:

- quantity multiplier for each visible LOT row in BOM (`quantity_per_pn` in the persisted `lot_mappings[]`)
- default = `1`
- editable inline

### Bundle mode (`1 PN -> multiple LOTs`)

The `+` action in BOM rows adds an extra LOT mapping row for the same vendor PN row.

- All visible LOT rows (the first plus any added rows) are persisted uniformly in `lot_mappings[]`.
- Each mapping row has:
  - LOT
  - qty (`LOT в 1 PN` = `quantity_per_pn`)
### BOM restore on config open

On config open, QuoteForge loads `vendor_spec` from the server and reconstructs the editable BOM table in normalized form:

- columns restored as: `Qty | P/N | Description | Price`
- column mapping restored as: `qty`, `pn`, `description`, `price`
- LOT / `LOT в 1 PN` rows are restored from `vendor_spec.lot_mappings[]`

This restores the BOM editing state, but not the original raw Excel layout (extra columns, ignored rows, original headers).
### Pricing Tab: column order

```
LOT | PN вендора | Описание | Кол-во | Estimate | Цена вендора | Склад | Конк.
```

**If BOM is empty** — the pricing tab still renders, using cart items as rows (`PN вендора` = "—", `Цена вендора` = "—").

**Description source priority:** BOM row description → LOT description from `local_components`.

### Pricing Tab: BOM + Estimate merge behavior

When a BOM exists, the pricing tab renders:

- BOM-based rows (including rows resolved via manual LOT and bundle mappings)
- plus **Estimate-only LOTs** (rows currently in the cart but not covered by BOM mappings)

Estimate-only rows are shown as separate rows with:

- `PN вендора` = `—`
- vendor price = `—`
- description from local components

### Pricing Tab: "Своя цена" (custom price) input

- Manual entry → proportionally redistributes the custom price into the "Цена вендора" (vendor price) cells, proportional to each row's Estimate share. The last row absorbs the rounding remainder.
- The "Проставить цены BOM" (apply BOM prices) button → restores per-row original BOM prices directly (no proportional redistribution) and sets "Своя цена" to their sum.
- Both paths show the "Скидка от Estimate: X%" (discount from Estimate) info.
- The "Экспорт CSV" (CSV export) button → downloads `pricing_<uuid>.csv` with a UTF-8 BOM, the same column order as the table, plus an Итого (total) row.
---
|
||||
|
||||
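The redistribution rule above (split proportionally to each row's Estimate share, last row absorbs the rounding remainder) can be sketched in Go. This is a minimal model assuming integer cents; `redistribute` is a hypothetical name, and the real implementation lives in the frontend JS.

```go
package main

import "fmt"

// redistribute splits a custom total across rows proportionally to their
// Estimate prices, in integer cents. The last row absorbs the rounding
// remainder so the column always sums exactly to the custom total.
func redistribute(estimates []int64, customTotal int64) []int64 {
	var estSum int64
	for _, e := range estimates {
		estSum += e
	}
	out := make([]int64, len(estimates))
	if len(estimates) == 0 || estSum == 0 {
		return out // nothing to distribute against
	}
	var assigned int64
	for i, e := range estimates {
		if i == len(estimates)-1 {
			out[i] = customTotal - assigned // remainder goes to the last row
			break
		}
		out[i] = customTotal * e / estSum
		assigned += out[i]
	}
	return out
}

func main() {
	// Estimate prices in cents: 100.00, 200.00, 50.00; custom total 300.00.
	fmt.Println(redistribute([]int64{10000, 20000, 5000}, 30000))
	// → [8571 17142 4287]  (8571 + 17142 + 4287 == 30000)
}
```

The same remainder trick keeps the exported CSV's "Итого" row consistent with the per-row cells.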
## API Endpoints

| Method | URL | Description |
|--------|-----|-------------|
| GET | `/api/configs/:uuid/vendor-spec` | Fetch stored BOM |
| PUT | `/api/configs/:uuid/vendor-spec` | Replace BOM (full update) |
| POST | `/api/configs/:uuid/vendor-spec/resolve` | Resolve PNs → LOTs (no cart mutation) |
| POST | `/api/configs/:uuid/vendor-spec/apply` | Apply resolved LOTs to cart |
| POST | `/api/projects/:uuid/vendor-import` | Import a `CFXML` workspace into an existing project and create grouped configurations |
| GET | `/api/partnumber-books` | List local book snapshots |
| GET | `/api/partnumber-books/:id` | Items for a book by `server_id` |
| POST | `/api/sync/partnumber-books` | Pull book snapshots from MariaDB |
| POST | `/api/sync/partnumber-seen` | Push unresolved PNs to `qt_vendor_partnumber_seen` on MariaDB |

## Unresolved PN Tracking (`qt_vendor_partnumber_seen`)

After each `resolveBOM()` call, QuoteForge pushes PN rows to `POST /api/sync/partnumber-seen` (fire-and-forget from JS — errors are silently ignored):

- unresolved BOM rows (`ignored = false`)
- raw BOM rows explicitly marked as ignored in the UI (`ignored = true`) — these rows are **not** saved to `vendor_spec`, but are reported for server-side tracking

The handler calls `sync.PushPartnumberSeen()`, which inserts into `qt_vendor_partnumber_seen`.
If a row with the same `partnumber` already exists, QuoteForge must leave it untouched:

- do not update `last_seen_at`
- do not update `is_ignored`
- do not update `description`

Canonical insert behavior:

```sql
INSERT INTO qt_vendor_partnumber_seen (source_type, vendor, partnumber, description, is_ignored, last_seen_at)
VALUES ('manual', '', ?, ?, ?, NOW())
ON DUPLICATE KEY UPDATE
  partnumber = partnumber
```

Uniqueness key: `partnumber` only (after PriceForge migration 025). PriceForge uses this table to populate the partnumber book.

Partnumber book sync contract:

- PriceForge writes membership snapshots to `qt_partnumber_books.partnumbers_json`.
- PriceForge writes canonical PN payloads to `qt_partnumber_book_items`.
- QuoteForge syncs book headers first, then pulls PN payloads with:
  `SELECT partnumber, lots_json, description FROM qt_partnumber_book_items WHERE partnumber IN (...)`

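The `ON DUPLICATE KEY UPDATE partnumber = partnumber` clause above is a deliberate no-op: the first sighting of a part number wins, and later sightings leave the stored row untouched. The same contract can be illustrated in plain Go with a map standing in for the table (the struct and function names here are illustrative, not the real sync code):

```go
package main

import "fmt"

type seenRow struct {
	Description string
	IsIgnored   bool
}

// insertSeen mimics the no-op ON DUPLICATE KEY UPDATE contract: the first
// insert for a partnumber wins, and later sightings of the same partnumber
// leave last_seen_at, is_ignored and description completely untouched.
func insertSeen(table map[string]seenRow, pn, description string, ignored bool) {
	if _, exists := table[pn]; exists {
		return // duplicate key: leave the existing row as-is
	}
	table[pn] = seenRow{Description: description, IsIgnored: ignored}
}

func main() {
	table := map[string]seenRow{}
	insertSeen(table, "P100", "first description", false)
	insertSeen(table, "P100", "second description", true) // ignored
	fmt.Println(table["P100"].Description, table["P100"].IsIgnored)
	// → first description false
}
```
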
## BOM Persistence

- `vendor_spec` is saved to the server via `PUT /api/configs/:uuid/vendor-spec`.
- `GET` / `PUT` `vendor_spec` must preserve the row-level mapping fields used by the UI:
  - `lot_mappings[]`
  - each item: `lot_name`, `quantity_per_pn`
- `description` is persisted in each BOM row and is used by the Pricing tab when available.
- Ignored raw rows are **not** persisted into `vendor_spec`.
- The PUT handler explicitly marshals `VendorSpec` to a JSON string before passing it to GORM `Update` (GORM does not reliably call `driver.Valuer` for custom types in `Update(column, value)`).
- BOM is autosaved (debounced) after BOM-changing actions, including:
  - `resolveBOM()`
  - LOT row qty (`LOT в 1 PN`) changes
  - LOT row add/remove (`+` / delete in bundle context)
- The "Сохранить BOM" button triggers an explicit save.

## Pricing Tab: Estimate Price Source

`renderPricingTab()` is async. It calls `POST /api/quote/price-levels` with LOTs collected from:

- `lot_mappings[]` from BOM rows
- current Estimate/cart LOTs not covered by BOM mappings (to show estimate-only rows)

This ensures Estimate prices appear for:

- manually matched LOTs in the BOM tab
- bundle LOTs
- LOTs already present in the Estimate but not mapped from BOM

### Apply to Estimate (`Пересчитать эстимейт`)

When applying BOM to Estimate, QuoteForge builds cart rows from the explicit UI mappings stored in `lot_mappings[]`.

For a BOM row with PN qty = `Q`:

- each mapped LOT contributes `Q * quantity_per_pn`

Rows without any valid LOT mapping are skipped.

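The quantity rule above can be sketched as follows; the types and the function name are illustrative, but the arithmetic (`Q * quantity_per_pn` per mapped LOT, unmapped rows skipped) follows this document:

```go
package main

import "fmt"

type mapping struct {
	Lot   string // target LOT name; empty means "no valid mapping"
	PerPN int    // quantity_per_pn ("LOT в 1 PN")
}

// cartQuantities expands one BOM row into Estimate quantities: a row with
// PN qty Q contributes Q * quantity_per_pn for each mapped LOT; mappings
// without a valid LOT are skipped. Repeated LOTs accumulate.
func cartQuantities(pnQty int, mappings []mapping) map[string]int {
	out := map[string]int{}
	for _, m := range mappings {
		if m.Lot == "" {
			continue // no valid LOT mapping: skip
		}
		out[m.Lot] += pnQty * m.PerPN
	}
	return out
}

func main() {
	// Bundle row: one vendor PN (qty 2) maps to two LOTs.
	fmt.Println(cartQuantities(2, []mapping{
		{Lot: "CPU-6430", PerPN: 2},
		{Lot: "HEATSINK-STD", PerPN: 1},
	}))
	// → map[CPU-6430:4 HEATSINK-STD:2]
}
```
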
## Web Route

| Route | Page |
|-------|------|
| `/partnumber-books` | Partnumber books — active book summary (unique LOTs, total PN, primary PN count), searchable items table, collapsible snapshot history |

`POST /api/projects/:uuid/vendor-import` contract:

- accepted file field is `file`;
- maximum file size is `1 GiB`;
- one `ProprietaryGroupIdentifier` becomes one QuoteForge configuration;
- software rows stay inside their hardware group and never become standalone configurations;
- the primary group row is selected structurally, without vendor-specific SKU hardcoding;
- imported configuration order follows workspace order.

Imported configuration fields:

- `name` from primary row `ProductName`
- `server_count` from primary row `Quantity`
- `server_model` from primary row `ProductDescription`
- `article` or `support_code` from `ProprietaryProductIdentifier`

Imported BOM rows become `vendor_spec` rows and are resolved through the active local partnumber book when possible.

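The grouping rules above can be sketched as a small Go function. The row fields are assumptions derived from the CFXML field names in this document; only the behavior (one group per `ProprietaryGroupIdentifier`, software rows kept inside their group, workspace order preserved) is taken from the contract:

```go
package main

import "fmt"

// cfxmlRow models the fields of one workspace row that matter for grouping;
// the struct itself is illustrative, not the app's real parser type.
type cfxmlRow struct {
	GroupID    string // ProprietaryGroupIdentifier
	Product    string // ProductName
	IsSoftware bool
}

// groupRows splits workspace rows into one slice per group identifier,
// preserving first-seen workspace order. Software rows stay inside their
// hardware group rather than forming standalone configurations.
func groupRows(rows []cfxmlRow) [][]cfxmlRow {
	index := map[string]int{}
	var groups [][]cfxmlRow
	for _, r := range rows {
		i, ok := index[r.GroupID]
		if !ok {
			i = len(groups)
			index[r.GroupID] = i
			groups = append(groups, nil)
		}
		groups[i] = append(groups[i], r)
	}
	return groups
}

func main() {
	rows := []cfxmlRow{
		{GroupID: "g1", Product: "Server A"},
		{GroupID: "g1", Product: "OS license", IsSoftware: true},
		{GroupID: "g2", Product: "Server B"},
	}
	fmt.Println(len(groupRows(rows))) // → 2
}
```

Each resulting group then becomes one configuration, with the primary row supplying `name`, `server_count`, `server_model`, and the article.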
@@ -1,55 +1,30 @@
-# QuoteForge Bible — Architectural Documentation
+# QuoteForge Bible

-The single source of truth for architecture, schemas, and patterns.
+Project-specific architecture and operational contracts.

----
+## Files

-## Table of Contents
+| File | Scope |
+| --- | --- |
+| [01-overview.md](01-overview.md) | Product scope, runtime model, repository map |
+| [02-architecture.md](02-architecture.md) | Local-first rules, sync, pricing, versioning |
+| [03-database.md](03-database.md) | SQLite and MariaDB data model, permissions, migrations |
+| [04-api.md](04-api.md) | HTTP routes and API contract |
+| [05-config.md](05-config.md) | Runtime config, paths, env vars, startup behavior |
+| [06-backup.md](06-backup.md) | Backup contract and restore workflow |
+| [07-dev.md](07-dev.md) | Development commands and guardrails |
+| [09-vendor-spec.md](09-vendor-spec.md) | Vendor BOM and CFXML import contract |

-| File | Topic |
-|------|-------|
-| [01-overview.md](01-overview.md) | Product: purpose, features, tech stack, repository structure |
-| [02-architecture.md](02-architecture.md) | Architecture: local-first, sync, pricing, versioning |
-| [03-database.md](03-database.md) | DB schemas: SQLite + MariaDB, permissions, indexes |
-| [04-api.md](04-api.md) | API endpoints and web routes |
-| [05-config.md](05-config.md) | Configuration, environment variables, paths, installation |
-| [06-backup.md](06-backup.md) | Backup: implementation, rotation policy |
-| [07-dev.md](07-dev.md) | Development: commands, code style, guardrails |
+## Rules

----
+- `bible-local/` is the source of truth for QuoteForge-specific behavior.
+- Keep these files in English.
+- Update the matching file in the same commit as any architectural change.
+- Remove stale documentation instead of preserving history in place.

-## Bible Rules
+## Quick reference

-> **Every architectural decision must be recorded in the Bible.**
->
-> Any change to DB schema, data access patterns, sync behavior, API contracts,
-> configuration format, or any other system-level aspect — the corresponding `bible/` file
-> **must be updated in the same commit** as the code.
->
-> On every user-requested commit, the Bible must be reviewed and updated in that commit.
->
-> The Bible is the single source of truth for architecture. Outdated documentation is worse than none.

-> **Documentation language: English.**
->
-> All files in `bible/` are written and updated **in English only**.
-> Mixing languages is not allowed.

----

-## Quick Reference

-**Where is user data stored?**
-SQLite → `~/Library/Application Support/QuoteForge/qfs.db` (macOS). MariaDB is sync-only.

-**How to look up a price for a line item?**
-`local_pricelist_items` → by `pricelist_id` from config + `lot_name`. Prices are **never** taken from `local_components`.

-**Pre-commit check?**
-`go build ./cmd/qfs && go vet ./...`

-**What must never be restored?**
-cron jobs, admin pricing, alerts, stock import, importer utility — all removed intentionally.

-**Where is the release changelog?**
-`releases/memory/v{major}.{minor}.{patch}.md`
+- Local DB path: see [05-config.md](05-config.md)
+- Runtime bind: loopback only
+- Local backups: see [06-backup.md](06-backup.md)
+- Release notes: `releases/<version>/RELEASE_NOTES.md`

173 cmd/migrate_project_updated_at/main.go Normal file
@@ -0,0 +1,173 @@
package main

import (
	"flag"
	"fmt"
	"log"
	"sort"
	"time"

	"git.mchus.pro/mchus/quoteforge/internal/appstate"
	"git.mchus.pro/mchus/quoteforge/internal/localdb"
	"git.mchus.pro/mchus/quoteforge/internal/models"
	"gorm.io/driver/mysql"
	"gorm.io/gorm"
	"gorm.io/gorm/logger"
)

type projectTimestampRow struct {
	UUID      string
	UpdatedAt time.Time
}

type updatePlanRow struct {
	UUID            string
	Code            string
	Variant         string
	LocalUpdatedAt  time.Time
	ServerUpdatedAt time.Time
}

func main() {
	defaultLocalDBPath, err := appstate.ResolveDBPath("")
	if err != nil {
		log.Fatalf("failed to resolve default local SQLite path: %v", err)
	}

	localDBPath := flag.String("localdb", defaultLocalDBPath, "path to local SQLite database (default: user state dir or QFS_DB_PATH)")
	apply := flag.Bool("apply", false, "apply updates to local SQLite (default is preview only)")
	flag.Parse()

	local, err := localdb.New(*localDBPath)
	if err != nil {
		log.Fatalf("failed to initialize local database: %v", err)
	}
	defer local.Close()

	if !local.HasSettings() {
		log.Fatalf("SQLite connection settings are not configured. Run qfs setup first.")
	}

	dsn, err := local.GetDSN()
	if err != nil {
		log.Fatalf("failed to build DSN from SQLite settings: %v", err)
	}

	db, err := gorm.Open(mysql.Open(dsn), &gorm.Config{
		Logger: logger.Default.LogMode(logger.Silent),
	})
	if err != nil {
		log.Fatalf("failed to connect to MariaDB: %v", err)
	}

	serverRows, err := loadServerProjects(db)
	if err != nil {
		log.Fatalf("failed to load server projects: %v", err)
	}

	localProjects, err := local.GetAllProjects(true)
	if err != nil {
		log.Fatalf("failed to load local projects: %v", err)
	}

	plan := buildUpdatePlan(localProjects, serverRows)
	printPlan(plan, *apply)

	if !*apply || len(plan) == 0 {
		return
	}

	updated := 0
	for i := range plan {
		project, err := local.GetProjectByUUID(plan[i].UUID)
		if err != nil {
			log.Printf("skip %s: load local project: %v", plan[i].UUID, err)
			continue
		}
		project.UpdatedAt = plan[i].ServerUpdatedAt
		if err := local.SaveProjectPreservingUpdatedAt(project); err != nil {
			log.Printf("skip %s: save local project: %v", plan[i].UUID, err)
			continue
		}
		updated++
	}

	log.Printf("updated %d local project timestamps", updated)
}

func loadServerProjects(db *gorm.DB) (map[string]time.Time, error) {
	var rows []projectTimestampRow
	if err := db.Model(&models.Project{}).
		Select("uuid, updated_at").
		Find(&rows).Error; err != nil {
		return nil, err
	}

	out := make(map[string]time.Time, len(rows))
	for _, row := range rows {
		if row.UUID == "" {
			continue
		}
		out[row.UUID] = row.UpdatedAt
	}
	return out, nil
}

func buildUpdatePlan(localProjects []localdb.LocalProject, serverRows map[string]time.Time) []updatePlanRow {
	plan := make([]updatePlanRow, 0)
	for i := range localProjects {
		project := localProjects[i]
		serverUpdatedAt, ok := serverRows[project.UUID]
		if !ok {
			continue
		}
		if project.UpdatedAt.Equal(serverUpdatedAt) {
			continue
		}
		plan = append(plan, updatePlanRow{
			UUID:            project.UUID,
			Code:            project.Code,
			Variant:         project.Variant,
			LocalUpdatedAt:  project.UpdatedAt,
			ServerUpdatedAt: serverUpdatedAt,
		})
	}

	sort.Slice(plan, func(i, j int) bool {
		if plan[i].Code != plan[j].Code {
			return plan[i].Code < plan[j].Code
		}
		return plan[i].Variant < plan[j].Variant
	})

	return plan
}

func printPlan(plan []updatePlanRow, apply bool) {
	mode := "preview"
	if apply {
		mode = "apply"
	}
	log.Printf("project updated_at resync mode=%s changes=%d", mode, len(plan))
	if len(plan) == 0 {
		log.Printf("no local project timestamps need resync")
		return
	}
	for _, row := range plan {
		variant := row.Variant
		if variant == "" {
			variant = "main"
		}
		log.Printf("%s [%s] local=%s server=%s", row.Code, variant, formatStamp(row.LocalUpdatedAt), formatStamp(row.ServerUpdatedAt))
	}
	if !apply {
		fmt.Println("Re-run with -apply to write server updated_at into local SQLite.")
	}
}

func formatStamp(value time.Time) string {
	if value.IsZero() {
		return "zero"
	}
	return value.Format(time.RFC3339)
}
@@ -39,6 +39,10 @@ logging:
 		t.Fatalf("load legacy config: %v", err)
 	}
 	setConfigDefaults(cfg)
+	cfg.Server.Host, _, err = normalizeLoopbackServerHost(cfg.Server.Host)
+	if err != nil {
+		t.Fatalf("normalize server host: %v", err)
+	}
 	if err := migrateConfigFileToRuntimeShape(path, cfg); err != nil {
 		t.Fatalf("migrate config: %v", err)
 	}
@@ -60,7 +64,43 @@ logging:
 	if !strings.Contains(text, "port: 9191") {
 		t.Fatalf("migrated config did not preserve server port:\n%s", text)
 	}
+	if !strings.Contains(text, "host: 127.0.0.1") {
+		t.Fatalf("migrated config did not normalize server host:\n%s", text)
+	}
 	if !strings.Contains(text, "level: debug") {
 		t.Fatalf("migrated config did not preserve logging level:\n%s", text)
 	}
 }
+
+func TestNormalizeLoopbackServerHost(t *testing.T) {
+	t.Parallel()
+
+	cases := []struct {
+		host        string
+		want        string
+		wantChanged bool
+		wantErr     bool
+	}{
+		{host: "127.0.0.1", want: "127.0.0.1", wantChanged: false, wantErr: false},
+		{host: "localhost", want: "127.0.0.1", wantChanged: true, wantErr: false},
+		{host: "::1", want: "127.0.0.1", wantChanged: true, wantErr: false},
+		{host: "0.0.0.0", want: "127.0.0.1", wantChanged: true, wantErr: false},
+		{host: "192.168.1.10", want: "127.0.0.1", wantChanged: true, wantErr: false},
+	}
+
+	for _, tc := range cases {
+		got, changed, err := normalizeLoopbackServerHost(tc.host)
+		if tc.wantErr && err == nil {
+			t.Fatalf("expected error for host %q", tc.host)
+		}
+		if !tc.wantErr && err != nil {
+			t.Fatalf("unexpected error for host %q: %v", tc.host, err)
+		}
+		if got != tc.want {
+			t.Fatalf("unexpected normalized host for %q: got %q want %q", tc.host, got, tc.want)
+		}
+		if changed != tc.wantChanged {
+			t.Fatalf("unexpected changed flag for %q: got %t want %t", tc.host, changed, tc.wantChanged)
+		}
+	}
+}
276 cmd/qfs/main.go
@@ -10,6 +10,7 @@ import (
 	"io/fs"
 	"log/slog"
 	"math"
+	"net"
 	"net/http"
 	"os"
 	"os/exec"
@@ -43,11 +44,16 @@ import (

 // Version is set via ldflags during build
 var Version = "dev"
+var errVendorImportTooLarge = errors.New("vendor workspace file exceeds 1 GiB limit")

 const backgroundSyncInterval = 5 * time.Minute
 const onDemandPullCooldown = 30 * time.Second
 const startupConsoleWarning = "Не закрывайте консоль иначе приложение не будет работать"

+var vendorImportMaxBytes int64 = 1 << 30
+
+const vendorImportMultipartOverheadBytes int64 = 8 << 20
+
 func main() {
 	showStartupConsoleWarning()

@@ -142,6 +148,15 @@ func main() {
 		}
 	}
 	setConfigDefaults(cfg)
+	normalizedHost, changed, err := normalizeLoopbackServerHost(cfg.Server.Host)
+	if err != nil {
+		slog.Error("invalid server host", "host", cfg.Server.Host, "error", err)
+		os.Exit(1)
+	}
+	if changed {
+		slog.Warn("corrected server host to loopback", "from", cfg.Server.Host, "to", normalizedHost)
+	}
+	cfg.Server.Host = normalizedHost
 	if err := migrateConfigFileToRuntimeShape(resolvedConfigPath, cfg); err != nil {
 		slog.Error("failed to migrate config file format", "path", resolvedConfigPath, "error", err)
 		os.Exit(1)
@@ -319,29 +334,47 @@ func setConfigDefaults(cfg *config.Config) {
 	if cfg.Server.WriteTimeout == 0 {
 		cfg.Server.WriteTimeout = 30 * time.Second
 	}
 	if cfg.Pricing.DefaultMethod == "" {
 		cfg.Pricing.DefaultMethod = "weighted_median"
 	}
 	if cfg.Pricing.DefaultPeriodDays == 0 {
 		cfg.Pricing.DefaultPeriodDays = 90
 	}
 	if cfg.Pricing.FreshnessGreenDays == 0 {
 		cfg.Pricing.FreshnessGreenDays = 30
 	}
 	if cfg.Pricing.FreshnessYellowDays == 0 {
 		cfg.Pricing.FreshnessYellowDays = 60
 	}
 	if cfg.Pricing.FreshnessRedDays == 0 {
 		cfg.Pricing.FreshnessRedDays = 90
 	}
 	if cfg.Pricing.MinQuotesForMedian == 0 {
 		cfg.Pricing.MinQuotesForMedian = 3
 	}
 	if cfg.Backup.Time == "" {
 		cfg.Backup.Time = "00:00"
 	}
 }

+func normalizeLoopbackServerHost(host string) (string, bool, error) {
+	trimmed := strings.TrimSpace(host)
+	if trimmed == "" {
+		return "", false, fmt.Errorf("server.host must not be empty")
+	}
+	const loopbackHost = "127.0.0.1"
+	if trimmed == loopbackHost {
+		return loopbackHost, false, nil
+	}
+	if strings.EqualFold(trimmed, "localhost") {
+		return loopbackHost, true, nil
+	}
+
+	ip := net.ParseIP(strings.Trim(trimmed, "[]"))
+	if ip != nil {
+		if ip.IsLoopback() || ip.IsUnspecified() {
+			return loopbackHost, trimmed != loopbackHost, nil
+		}
+		return loopbackHost, true, nil
+	}
+
+	return loopbackHost, true, nil
+}
+
+func vendorImportBodyLimit() int64 {
+	return vendorImportMaxBytes + vendorImportMultipartOverheadBytes
+}
+
+func isVendorImportTooLarge(fileSize int64, err error) bool {
+	if fileSize > vendorImportMaxBytes {
+		return true
+	}
+	var maxBytesErr *http.MaxBytesError
+	return errors.As(err, &maxBytesErr)
+}
+
 func ensureDefaultConfigFile(configPath string) error {
 	if strings.TrimSpace(configPath) == "" {
 		return fmt.Errorf("config path is empty")
@@ -747,6 +780,7 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
 	pricelistHandler := handlers.NewPricelistHandler(local)
 	vendorSpecHandler := handlers.NewVendorSpecHandler(local)
+	partnumberBooksHandler := handlers.NewPartnumberBooksHandler(local)
 	respondError := handlers.RespondError
 	syncHandler, err := handlers.NewSyncHandler(local, syncService, connMgr, templatesPath, backgroundSyncInterval)
 	if err != nil {
 		return nil, nil, fmt.Errorf("creating sync handler: %w", err)
@@ -766,6 +800,7 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect

 	// Router
 	router := gin.New()
+	router.MaxMultipartMemory = vendorImportBodyLimit()
 	router.Use(gin.Recovery())
 	router.Use(requestLogger())
 	router.Use(middleware.CORS())
@@ -786,17 +821,17 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
 		})
 	})

-	// Restart endpoint (for development purposes)
-	router.POST("/api/restart", func(c *gin.Context) {
-		// This will cause the server to restart by exiting
-		// The restartProcess function will be called to restart the process
-		slog.Info("Restart requested via API")
-		go func() {
-			time.Sleep(100 * time.Millisecond)
-			restartProcess()
-		}()
-		c.JSON(http.StatusOK, gin.H{"message": "restarting..."})
-	})
+	// Restart endpoint is intentionally debug-only.
+	if cfg.Server.Mode == "debug" {
+		router.POST("/api/restart", func(c *gin.Context) {
+			slog.Info("Restart requested via API")
+			go func() {
+				time.Sleep(100 * time.Millisecond)
+				restartProcess()
+			}()
+			c.JSON(http.StatusOK, gin.H{"message": "restarting..."})
+		})
+	}

 	// DB status endpoint
 	router.GET("/api/db-status", func(c *gin.Context) {
@@ -928,7 +963,7 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect

 		cfgs, total, err := configService.ListAllWithStatus(page, perPage, status, search)
 		if err != nil {
-			c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+			respondError(c, http.StatusInternalServerError, "internal server error", err)
 			return
 		}

@@ -949,7 +984,7 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
 				c.JSON(http.StatusServiceUnavailable, gin.H{"error": "Database is offline"})
 				return
 			}
-			c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+			respondError(c, http.StatusInternalServerError, "internal server error", err)
 			return
 		}
 		c.JSON(http.StatusOK, result)
@@ -958,13 +993,13 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
 	configs.POST("", func(c *gin.Context) {
 		var req services.CreateConfigRequest
 		if err := c.ShouldBindJSON(&req); err != nil {
-			c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
+			respondError(c, http.StatusBadRequest, "invalid request", err)
 			return
 		}

 		config, err := configService.Create(dbUsername, &req)
 		if err != nil {
-			c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+			respondError(c, http.StatusInternalServerError, "internal server error", err)
 			return
 		}

@@ -974,12 +1009,12 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
 	configs.POST("/preview-article", func(c *gin.Context) {
 		var req services.ArticlePreviewRequest
 		if err := c.ShouldBindJSON(&req); err != nil {
-			c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
+			respondError(c, http.StatusBadRequest, "invalid request", err)
 			return
 		}
 		result, err := configService.BuildArticlePreview(&req)
 		if err != nil {
-			c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
+			respondError(c, http.StatusBadRequest, "invalid request", err)
 			return
 		}
 		c.JSON(http.StatusOK, gin.H{
@@ -1002,7 +1037,7 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
 		uuid := c.Param("uuid")
 		var req services.CreateConfigRequest
 		if err := c.ShouldBindJSON(&req); err != nil {
-			c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
+			respondError(c, http.StatusBadRequest, "invalid request", err)
 			return
 		}

@@ -1010,13 +1045,13 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
 		if err != nil {
 			switch {
 			case errors.Is(err, services.ErrConfigNotFound):
-				c.JSON(http.StatusNotFound, gin.H{"error": err.Error()})
+				respondError(c, http.StatusNotFound, "resource not found", err)
 			case errors.Is(err, services.ErrProjectNotFound):
-				c.JSON(http.StatusNotFound, gin.H{"error": err.Error()})
+				respondError(c, http.StatusNotFound, "resource not found", err)
 			case errors.Is(err, services.ErrProjectForbidden):
-				c.JSON(http.StatusForbidden, gin.H{"error": err.Error()})
+				respondError(c, http.StatusForbidden, "access denied", err)
 			default:
-				c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+				respondError(c, http.StatusInternalServerError, "internal server error", err)
 			}
 			return
 		}
@@ -1027,7 +1062,7 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
 	configs.DELETE("/:uuid", func(c *gin.Context) {
 		uuid := c.Param("uuid")
 		if err := configService.DeleteNoAuth(uuid); err != nil {
-			c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+			respondError(c, http.StatusInternalServerError, "internal server error", err)
 			return
 		}
 		c.JSON(http.StatusOK, gin.H{"message": "archived"})
@@ -1037,7 +1072,7 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
 		uuid := c.Param("uuid")
 		config, err := configService.ReactivateNoAuth(uuid)
 		if err != nil {
-			c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+			respondError(c, http.StatusInternalServerError, "internal server error", err)
 			return
 		}
 		c.JSON(http.StatusOK, gin.H{
@@ -1052,13 +1087,13 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
 			Name string `json:"name"`
 		}
 		if err := c.ShouldBindJSON(&req); err != nil {
-			c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
+			respondError(c, http.StatusBadRequest, "invalid request", err)
 			return
 		}

 		config, err := configService.RenameNoAuth(uuid, req.Name)
 		if err != nil {
-			c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+			respondError(c, http.StatusInternalServerError, "internal server error", err)
 			return
 		}

@@ -1072,7 +1107,7 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
 			FromVersion int `json:"from_version"`
 		}
 		if err := c.ShouldBindJSON(&req); err != nil {
-			c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
+			respondError(c, http.StatusBadRequest, "invalid request", err)
 			return
 		}

@@ -1082,7 +1117,7 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
 				c.JSON(http.StatusNotFound, gin.H{"error": "version not found"})
 				return
 			}
-			c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+			respondError(c, http.StatusInternalServerError, "internal server error", err)
 			return
 		}

@@ -1093,7 +1128,7 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
 		uuid := c.Param("uuid")
 		config, err := configService.RefreshPricesNoAuth(uuid)
 		if err != nil {
-			c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+			respondError(c, http.StatusInternalServerError, "internal server error", err)
 			return
 		}
 		c.JSON(http.StatusOK, config)
@@ -1105,20 +1140,20 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
 			ProjectUUID string `json:"project_uuid"`
 		}
 		if err := c.ShouldBindJSON(&req); err != nil {
-			c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
+			respondError(c, http.StatusBadRequest, "invalid request", err)
 			return
 		}
 		updated, err := configService.SetProjectNoAuth(uuid, req.ProjectUUID)
 		if err != nil {
 			switch {
 			case errors.Is(err, services.ErrConfigNotFound):
-				c.JSON(http.StatusNotFound, gin.H{"error": err.Error()})
+				respondError(c, http.StatusNotFound, "resource not found", err)
 			case errors.Is(err, services.ErrProjectNotFound):
-				c.JSON(http.StatusNotFound, gin.H{"error": err.Error()})
+				respondError(c, http.StatusNotFound, "resource not found", err)
 			case errors.Is(err, services.ErrProjectForbidden):
-				c.JSON(http.StatusForbidden, gin.H{"error": err.Error()})
+				respondError(c, http.StatusForbidden, "access denied", err)
 			default:
-				c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+				respondError(c, http.StatusInternalServerError, "internal server error", err)
 			}
 			return
 		}
@@ -1147,7 +1182,7 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
 			case errors.Is(err, services.ErrInvalidVersionNumber):
 				c.JSON(http.StatusBadRequest, gin.H{"error": "invalid paging params"})
 			default:
-				c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+				respondError(c, http.StatusInternalServerError, "internal server error", err)
 			}
 			return
 		}
@@ -1175,7 +1210,7 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
 			case errors.Is(err, services.ErrConfigVersionNotFound):
 				c.JSON(http.StatusNotFound, gin.H{"error": "version not found"})
 			default:
-				c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+				respondError(c, http.StatusInternalServerError, "internal server error", err)
 			}
 			return
 		}
@@ -1190,7 +1225,7 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
 			Note string `json:"note"`
 		}
 		if err := c.ShouldBindJSON(&req); err != nil {
-			c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
+			respondError(c, http.StatusBadRequest, "invalid request", err)
 			return
 		}
 		if req.TargetVersion <= 0 {
@@ -1208,7 +1243,7 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
 			case errors.Is(err, services.ErrVersionConflict):
 				c.JSON(http.StatusConflict, gin.H{"error": "version conflict"})
 			default:
-				c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+				respondError(c, http.StatusInternalServerError, "internal server error", err)
 			}
 			return
 		}
@@ -1243,12 +1278,12 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
 			ServerCount int `json:"server_count" binding:"required,min=1"`
 		}
 		if err := c.ShouldBindJSON(&req); err != nil {
-			c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
+			respondError(c, http.StatusBadRequest, "invalid request", err)
 			return
 		}
 		config, err := configService.UpdateServerCount(uuid, req.ServerCount)
 		if err != nil {
-			c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+			respondError(c, http.StatusInternalServerError, "internal server error", err)
 			return
 		}
 		c.JSON(http.StatusOK, config)
@@ -1293,7 +1328,7 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect

 		allProjects, err := projectService.ListByUser(dbUsername, true)
 		if err != nil {
-			c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+			respondError(c, http.StatusInternalServerError, "internal server error", err)
 			return
 		}

@@ -1427,7 +1462,7 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
 	projects.GET("/all", func(c *gin.Context) {
|
||||
allProjects, err := projectService.ListByUser(dbUsername, true)
|
||||
if err != nil {
|
||||
c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusInternalServerError, "internal server error", err)
|
||||
return
|
||||
}
|
||||
|
||||
@@ -1457,7 +1492,7 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
|
||||
projects.POST("", func(c *gin.Context) {
|
||||
var req services.CreateProjectRequest
|
||||
if err := c.ShouldBindJSON(&req); err != nil {
|
||||
c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusBadRequest, "invalid request", err)
|
||||
return
|
||||
}
|
||||
if strings.TrimSpace(req.Code) == "" {
|
||||
@@ -1467,10 +1502,12 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
|
||||
project, err := projectService.Create(dbUsername, &req)
|
||||
if err != nil {
|
||||
switch {
|
||||
case errors.Is(err, services.ErrReservedMainVariant):
|
||||
respondError(c, http.StatusBadRequest, "invalid request", err)
|
||||
case errors.Is(err, services.ErrProjectCodeExists):
|
||||
c.JSON(http.StatusConflict, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusConflict, "conflict detected", err)
|
||||
default:
|
||||
c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusInternalServerError, "internal server error", err)
|
||||
}
|
||||
return
|
||||
}
|
||||
@@ -1482,11 +1519,11 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
|
||||
if err != nil {
|
||||
switch {
|
||||
case errors.Is(err, services.ErrProjectNotFound):
|
||||
c.JSON(http.StatusNotFound, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusNotFound, "resource not found", err)
|
||||
case errors.Is(err, services.ErrProjectForbidden):
|
||||
c.JSON(http.StatusForbidden, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusForbidden, "access denied", err)
|
||||
default:
|
||||
c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusInternalServerError, "internal server error", err)
|
||||
}
|
||||
return
|
||||
}
|
||||
@@ -1496,20 +1533,22 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
|
||||
projects.PUT("/:uuid", func(c *gin.Context) {
|
||||
var req services.UpdateProjectRequest
|
||||
if err := c.ShouldBindJSON(&req); err != nil {
|
||||
c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusBadRequest, "invalid request", err)
|
||||
return
|
||||
}
|
||||
project, err := projectService.Update(c.Param("uuid"), dbUsername, &req)
|
||||
if err != nil {
|
||||
switch {
|
||||
case errors.Is(err, services.ErrReservedMainVariant):
|
||||
respondError(c, http.StatusBadRequest, "invalid request", err)
|
||||
case errors.Is(err, services.ErrProjectCodeExists):
|
||||
c.JSON(http.StatusConflict, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusConflict, "conflict detected", err)
|
||||
case errors.Is(err, services.ErrProjectNotFound):
|
||||
c.JSON(http.StatusNotFound, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusNotFound, "resource not found", err)
|
||||
case errors.Is(err, services.ErrProjectForbidden):
|
||||
c.JSON(http.StatusForbidden, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusForbidden, "access denied", err)
|
||||
default:
|
||||
c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusInternalServerError, "internal server error", err)
|
||||
}
|
||||
return
|
||||
}
|
||||
@@ -1520,11 +1559,11 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
|
||||
if err := projectService.Archive(c.Param("uuid"), dbUsername); err != nil {
|
||||
switch {
|
||||
case errors.Is(err, services.ErrProjectNotFound):
|
||||
c.JSON(http.StatusNotFound, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusNotFound, "resource not found", err)
|
||||
case errors.Is(err, services.ErrProjectForbidden):
|
||||
c.JSON(http.StatusForbidden, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusForbidden, "access denied", err)
|
||||
default:
|
||||
c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusInternalServerError, "internal server error", err)
|
||||
}
|
||||
return
|
||||
}
|
||||
@@ -1535,11 +1574,11 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
|
||||
if err := projectService.Reactivate(c.Param("uuid"), dbUsername); err != nil {
|
||||
switch {
|
||||
case errors.Is(err, services.ErrProjectNotFound):
|
||||
c.JSON(http.StatusNotFound, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusNotFound, "resource not found", err)
|
||||
case errors.Is(err, services.ErrProjectForbidden):
|
||||
c.JSON(http.StatusForbidden, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusForbidden, "access denied", err)
|
||||
default:
|
||||
c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusInternalServerError, "internal server error", err)
|
||||
}
|
||||
return
|
||||
}
|
||||
@@ -1550,13 +1589,13 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
|
||||
if err := projectService.DeleteVariant(c.Param("uuid"), dbUsername); err != nil {
|
||||
switch {
|
||||
case errors.Is(err, services.ErrCannotDeleteMainVariant):
|
||||
c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusBadRequest, "invalid request", err)
|
||||
case errors.Is(err, services.ErrProjectNotFound):
|
||||
c.JSON(http.StatusNotFound, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusNotFound, "resource not found", err)
|
||||
case errors.Is(err, services.ErrProjectForbidden):
|
||||
c.JSON(http.StatusForbidden, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusForbidden, "access denied", err)
|
||||
default:
|
||||
c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusInternalServerError, "internal server error", err)
|
||||
}
|
||||
return
|
||||
}
|
||||
@@ -1576,11 +1615,11 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
|
||||
if err != nil {
|
||||
switch {
|
||||
case errors.Is(err, services.ErrProjectNotFound):
|
||||
c.JSON(http.StatusNotFound, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusNotFound, "resource not found", err)
|
||||
case errors.Is(err, services.ErrProjectForbidden):
|
||||
c.JSON(http.StatusForbidden, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusForbidden, "access denied", err)
|
||||
default:
|
||||
c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusInternalServerError, "internal server error", err)
|
||||
}
|
||||
return
|
||||
}
|
||||
@@ -1593,7 +1632,7 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
|
||||
OrderedUUIDs []string `json:"ordered_uuids"`
|
||||
}
|
||||
if err := c.ShouldBindJSON(&req); err != nil {
|
||||
c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusBadRequest, "invalid request", err)
|
||||
return
|
||||
}
|
||||
if len(req.OrderedUUIDs) == 0 {
|
||||
@@ -1605,9 +1644,9 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
|
||||
if err != nil {
|
||||
switch {
|
||||
case errors.Is(err, services.ErrProjectNotFound):
|
||||
c.JSON(http.StatusNotFound, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusNotFound, "resource not found", err)
|
||||
default:
|
||||
c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusBadRequest, "invalid request", err)
|
||||
}
|
||||
return
|
||||
}
|
||||
@@ -1628,7 +1667,7 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
|
||||
projects.POST("/:uuid/configs", func(c *gin.Context) {
|
||||
var req services.CreateConfigRequest
|
||||
if err := c.ShouldBindJSON(&req); err != nil {
|
||||
c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusBadRequest, "invalid request", err)
|
||||
return
|
||||
}
|
||||
projectUUID := c.Param("uuid")
|
||||
@@ -1636,29 +1675,42 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
|
||||
|
||||
config, err := configService.Create(dbUsername, &req)
|
||||
if err != nil {
|
||||
c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusInternalServerError, "internal server error", err)
|
||||
return
|
||||
}
|
||||
c.JSON(http.StatusCreated, config)
|
||||
})
|
||||
|
||||
projects.POST("/:uuid/vendor-import", func(c *gin.Context) {
|
||||
c.Request.Body = http.MaxBytesReader(c.Writer, c.Request.Body, vendorImportBodyLimit())
|
||||
fileHeader, err := c.FormFile("file")
|
||||
if err != nil {
|
||||
c.JSON(http.StatusBadRequest, gin.H{"error": "file is required"})
|
||||
if isVendorImportTooLarge(0, err) {
|
||||
respondError(c, http.StatusBadRequest, "vendor workspace file exceeds 1 GiB limit", errVendorImportTooLarge)
|
||||
return
|
||||
}
|
||||
respondError(c, http.StatusBadRequest, "file is required", err)
|
||||
return
|
||||
}
|
||||
if isVendorImportTooLarge(fileHeader.Size, nil) {
|
||||
respondError(c, http.StatusBadRequest, "vendor workspace file exceeds 1 GiB limit", errVendorImportTooLarge)
|
||||
return
|
||||
}
|
||||
|
||||
file, err := fileHeader.Open()
|
||||
if err != nil {
|
||||
c.JSON(http.StatusBadRequest, gin.H{"error": "failed to open uploaded file"})
|
||||
respondError(c, http.StatusBadRequest, "failed to open uploaded file", err)
|
||||
return
|
||||
}
|
||||
defer file.Close()
|
||||
|
||||
data, err := io.ReadAll(file)
|
||||
data, err := io.ReadAll(io.LimitReader(file, vendorImportMaxBytes+1))
|
||||
if err != nil {
|
||||
c.JSON(http.StatusBadRequest, gin.H{"error": "failed to read uploaded file"})
|
||||
respondError(c, http.StatusBadRequest, "failed to read uploaded file", err)
|
||||
return
|
||||
}
|
||||
if int64(len(data)) > vendorImportMaxBytes {
|
||||
respondError(c, http.StatusBadRequest, "vendor workspace file exceeds 1 GiB limit", errVendorImportTooLarge)
|
||||
return
|
||||
}
|
||||
if !services.IsCFXMLWorkspace(data) {
|
||||
@@ -1670,9 +1722,9 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
|
||||
if err != nil {
|
||||
switch {
|
||||
case errors.Is(err, services.ErrProjectNotFound):
|
||||
c.JSON(http.StatusNotFound, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusNotFound, "resource not found", err)
|
||||
default:
|
||||
c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusBadRequest, "invalid request", err)
|
||||
}
|
||||
return
|
||||
}
|
||||
@@ -1688,14 +1740,14 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
|
||||
Name string `json:"name"`
|
||||
}
|
||||
if err := c.ShouldBindJSON(&req); err != nil {
|
||||
c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusBadRequest, "invalid request", err)
|
||||
return
|
||||
}
|
||||
|
||||
projectUUID := c.Param("uuid")
|
||||
config, err := configService.CloneNoAuthToProject(c.Param("config_uuid"), req.Name, dbUsername, &projectUUID)
|
||||
if err != nil {
|
||||
c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
|
||||
respondError(c, http.StatusInternalServerError, "internal server error", err)
|
||||
return
|
||||
}
|
||||
c.JSON(http.StatusCreated, config)
|
||||
@@ -1769,22 +1821,12 @@ func requestLogger() gin.HandlerFunc {
|
||||
path := c.Request.URL.Path
|
||||
query := c.Request.URL.RawQuery
|
||||
|
||||
blw := &captureResponseWriter{
|
||||
ResponseWriter: c.Writer,
|
||||
body: bytes.NewBuffer(nil),
|
||||
}
|
||||
c.Writer = blw
|
||||
|
||||
c.Next()
|
||||
|
||||
latency := time.Since(start)
|
||||
status := c.Writer.Status()
|
||||
|
||||
if status >= http.StatusBadRequest {
|
||||
responseBody := strings.TrimSpace(blw.body.String())
|
||||
if len(responseBody) > 2048 {
|
||||
responseBody = responseBody[:2048] + "...(truncated)"
|
||||
}
|
||||
errText := strings.TrimSpace(c.Errors.String())
|
||||
|
||||
slog.Error("request failed",
|
||||
@@ -1795,7 +1837,6 @@ func requestLogger() gin.HandlerFunc {
|
||||
"latency", latency,
|
||||
"ip", c.ClientIP(),
|
||||
"errors", errText,
|
||||
"response", responseBody,
|
||||
)
|
||||
return
|
||||
}
|
||||
@@ -1810,22 +1851,3 @@ func requestLogger() gin.HandlerFunc {
|
||||
)
|
||||
}
|
||||
}
|
||||
|
||||
type captureResponseWriter struct {
|
||||
gin.ResponseWriter
|
||||
body *bytes.Buffer
|
||||
}
|
||||
|
||||
func (w *captureResponseWriter) Write(b []byte) (int, error) {
|
||||
if len(b) > 0 {
|
||||
_, _ = w.body.Write(b)
|
||||
}
|
||||
return w.ResponseWriter.Write(b)
|
||||
}
|
||||
|
||||
func (w *captureResponseWriter) WriteString(s string) (int, error) {
|
||||
if s != "" {
|
||||
_, _ = w.body.WriteString(s)
|
||||
}
|
||||
return w.ResponseWriter.WriteString(s)
|
||||
}
|
||||
|
||||
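The hunks above swap every raw `c.JSON(status, gin.H{"error": err.Error()})` for a shared `respondError` helper whose definition is outside this diff. A minimal stdlib-only sketch of the pattern its call sites imply — a generic message to the client, the root cause kept server-side; the helper's name matches the diff, but this shape is an assumption (the real helper takes a `*gin.Context`):

```go
package main

import (
	"encoding/json"
	"fmt"
	"log"
	"net/http"
	"net/http/httptest"
	"strings"
)

// respondError writes only a generic message to the client and keeps the
// detailed error on the server side (here: the process log). This mirrors
// the diff's call sites; the real helper works on a *gin.Context instead.
func respondError(w http.ResponseWriter, status int, msg string, err error) {
	log.Printf("request failed: status=%d msg=%q err=%v", status, msg, err)
	w.Header().Set("Content-Type", "application/json")
	w.WriteHeader(status)
	_ = json.NewEncoder(w).Encode(map[string]string{"error": msg})
}

// noLeak exercises respondError and reports whether the client-visible body
// carries the generic message but not the internal error detail.
func noLeak() bool {
	rec := httptest.NewRecorder()
	respondError(rec, http.StatusBadRequest, "invalid request", fmt.Errorf("unexpected EOF at byte 17"))
	body := rec.Body.String()
	return strings.Contains(body, "invalid request") && !strings.Contains(body, "unexpected EOF")
}

func main() {
	fmt.Println(noLeak()) // the internal detail never reaches the response body
}
```

The same split motivates the `requestLogger` change below: once bodies are generic, logging the response adds nothing, so the error detail is carried via `c.Error(...)` instead.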
48 cmd/qfs/request_logger_test.go Normal file
@@ -0,0 +1,48 @@
+package main
+
+import (
+	"bytes"
+	"errors"
+	"log/slog"
+	"net/http"
+	"net/http/httptest"
+	"strings"
+	"testing"
+
+	"github.com/gin-gonic/gin"
+)
+
+func TestRequestLoggerDoesNotLogResponseBody(t *testing.T) {
+	gin.SetMode(gin.TestMode)
+
+	var logBuffer bytes.Buffer
+	previousLogger := slog.Default()
+	slog.SetDefault(slog.New(slog.NewTextHandler(&logBuffer, &slog.HandlerOptions{})))
+	defer slog.SetDefault(previousLogger)
+
+	router := gin.New()
+	router.Use(requestLogger())
+	router.GET("/fail", func(c *gin.Context) {
+		_ = c.Error(errors.New("root cause"))
+		c.JSON(http.StatusBadRequest, gin.H{"error": "do not log this body"})
+	})
+
+	rec := httptest.NewRecorder()
+	req := httptest.NewRequest(http.MethodGet, "/fail?debug=1", nil)
+	router.ServeHTTP(rec, req)
+
+	if rec.Code != http.StatusBadRequest {
+		t.Fatalf("expected 400, got %d", rec.Code)
+	}
+
+	logOutput := logBuffer.String()
+	if !strings.Contains(logOutput, "request failed") {
+		t.Fatalf("expected request failure log, got %q", logOutput)
+	}
+	if strings.Contains(logOutput, "do not log this body") {
+		t.Fatalf("response body leaked into logs: %q", logOutput)
+	}
+	if !strings.Contains(logOutput, "root cause") {
+		t.Fatalf("expected error details in logs, got %q", logOutput)
+	}
+}
@@ -3,10 +3,12 @@ package main
 import (
 	"bytes"
 	"encoding/json"
+	"mime/multipart"
 	"net/http"
 	"net/http/httptest"
 	"os"
 	"path/filepath"
+	"strings"
 	"testing"
 
 	"git.mchus.pro/mchus/quoteforge/internal/config"
@@ -290,6 +292,88 @@ func TestConfigMoveToProjectEndpoint(t *testing.T) {
 	}
 }
 
+func TestVendorImportRejectsOversizedUpload(t *testing.T) {
+	moveToRepoRoot(t)
+
+	prevLimit := vendorImportMaxBytes
+	vendorImportMaxBytes = 128
+	defer func() { vendorImportMaxBytes = prevLimit }()
+
+	local, connMgr, _ := newAPITestStack(t)
+	cfg := &config.Config{}
+	setConfigDefaults(cfg)
+	router, _, err := setupRouter(cfg, local, connMgr, "tester", nil)
+	if err != nil {
+		t.Fatalf("setup router: %v", err)
+	}
+
+	createProjectReq := httptest.NewRequest(http.MethodPost, "/api/projects", bytes.NewReader([]byte(`{"name":"Import Project","code":"IMP"}`)))
+	createProjectReq.Header.Set("Content-Type", "application/json")
+	createProjectRec := httptest.NewRecorder()
+	router.ServeHTTP(createProjectRec, createProjectReq)
+	if createProjectRec.Code != http.StatusCreated {
+		t.Fatalf("create project status=%d body=%s", createProjectRec.Code, createProjectRec.Body.String())
+	}
+
+	var project models.Project
+	if err := json.Unmarshal(createProjectRec.Body.Bytes(), &project); err != nil {
+		t.Fatalf("unmarshal project: %v", err)
+	}
+
+	var body bytes.Buffer
+	writer := multipart.NewWriter(&body)
+	part, err := writer.CreateFormFile("file", "huge.xml")
+	if err != nil {
+		t.Fatalf("create form file: %v", err)
+	}
+	payload := "<CFXML>" + strings.Repeat("A", int(vendorImportMaxBytes)+1) + "</CFXML>"
+	if _, err := part.Write([]byte(payload)); err != nil {
+		t.Fatalf("write multipart payload: %v", err)
+	}
+	if err := writer.Close(); err != nil {
+		t.Fatalf("close multipart writer: %v", err)
+	}
+
+	req := httptest.NewRequest(http.MethodPost, "/api/projects/"+project.UUID+"/vendor-import", &body)
+	req.Header.Set("Content-Type", writer.FormDataContentType())
+	rec := httptest.NewRecorder()
+	router.ServeHTTP(rec, req)
+
+	if rec.Code != http.StatusBadRequest {
+		t.Fatalf("expected 400 for oversized upload, got %d body=%s", rec.Code, rec.Body.String())
+	}
+	if !strings.Contains(rec.Body.String(), "1 GiB") {
+		t.Fatalf("expected size limit message, got %s", rec.Body.String())
+	}
+}
+
+func TestCreateConfigMalformedJSONReturnsGenericError(t *testing.T) {
+	moveToRepoRoot(t)
+
+	local, connMgr, _ := newAPITestStack(t)
+	cfg := &config.Config{}
+	setConfigDefaults(cfg)
+	router, _, err := setupRouter(cfg, local, connMgr, "tester", nil)
+	if err != nil {
+		t.Fatalf("setup router: %v", err)
+	}
+
+	req := httptest.NewRequest(http.MethodPost, "/api/configs", bytes.NewReader([]byte(`{"name":`)))
+	req.Header.Set("Content-Type", "application/json")
+	rec := httptest.NewRecorder()
+	router.ServeHTTP(rec, req)
+
+	if rec.Code != http.StatusBadRequest {
+		t.Fatalf("expected 400 for malformed json, got %d body=%s", rec.Code, rec.Body.String())
+	}
+	if strings.Contains(strings.ToLower(rec.Body.String()), "unexpected eof") {
+		t.Fatalf("expected sanitized error body, got %s", rec.Body.String())
+	}
+	if !strings.Contains(rec.Body.String(), "invalid request") {
+		t.Fatalf("expected generic invalid request message, got %s", rec.Body.String())
+	}
+}
+
 func newAPITestStack(t *testing.T) (*localdb.LocalDB, *db.ConnectionManager, *services.LocalConfigurationService) {
 	t.Helper()
 
@@ -1,56 +1,18 @@
-# QuoteForge Configuration
-# Copy this file to config.yaml and update values
+# QuoteForge runtime config
+# Runtime creates a minimal config automatically on first start.
+# This file is only a reference template.
 
 server:
-  host: "127.0.0.1"  # Use 0.0.0.0 to listen on all interfaces
+  host: "127.0.0.1"  # Loopback only; remote HTTP binding is unsupported
   port: 8080
   mode: "release"  # debug | release
   read_timeout: "30s"
   write_timeout: "30s"
 
-database:
-  host: "localhost"
-  port: 3306
-  name: "RFQ_LOG"
-  user: "quoteforge"
-  password: "CHANGE_ME"
-  max_open_conns: 25
-  max_idle_conns: 5
-  conn_max_lifetime: "5m"
-
-pricing:
-  default_method: "weighted_median"  # median | average | weighted_median
-  default_period_days: 90
-  freshness_green_days: 30
-  freshness_yellow_days: 60
-  freshness_red_days: 90
-  min_quotes_for_median: 3
-  popularity_decay_days: 180
-
 export:
   temp_dir: "/tmp/quoteforge-exports"
   max_file_age: "1h"
   company_name: "Your Company Name"
 
 backup:
   time: "00:00"
 
-alerts:
-  enabled: true
-  check_interval: "1h"
-  high_demand_threshold: 5  # quotes in the last 30 days
-  trending_threshold_percent: 50  # growth % that triggers an alert
-
-notifications:
-  email_enabled: false
-  smtp_host: "smtp.example.com"
-  smtp_port: 587
-  smtp_user: ""
-  smtp_password: ""
-  from_address: "quoteforge@example.com"
-
 logging:
   level: "info"  # debug | info | warn | error
-  format: "json"  # json | text
-  output: "stdout"  # stdout | file
-  file_path: "/var/log/quoteforge/app.log"
+  format: "json"  # json | text
+  output: "stdout"  # stdout | stderr | /path/to/file
@@ -10,6 +10,10 @@ import (
 	"sort"
 	"strings"
 	"time"
+
+	"github.com/glebarez/sqlite"
+	"gorm.io/gorm"
+	"gorm.io/gorm/logger"
 )
 
 type backupPeriod struct {
@@ -250,6 +254,12 @@ func pruneOldBackups(periodDir string, keep int) error {
 }
 
 func createBackupArchive(destPath, dbPath, configPath string) error {
+	snapshotPath, cleanup, err := createSQLiteSnapshot(dbPath)
+	if err != nil {
+		return err
+	}
+	defer cleanup()
+
 	file, err := os.Create(destPath)
 	if err != nil {
 		return err
@@ -257,12 +267,10 @@ func createBackupArchive(destPath, dbPath, configPath string) error {
 	defer file.Close()
 
 	zipWriter := zip.NewWriter(file)
-	if err := addZipFile(zipWriter, dbPath); err != nil {
+	if err := addZipFileAs(zipWriter, snapshotPath, filepath.Base(dbPath)); err != nil {
 		_ = zipWriter.Close()
 		return err
 	}
-	_ = addZipOptionalFile(zipWriter, dbPath+"-wal")
-	_ = addZipOptionalFile(zipWriter, dbPath+"-shm")
 
 	if strings.TrimSpace(configPath) != "" {
 		_ = addZipOptionalFile(zipWriter, configPath)
@@ -274,6 +282,77 @@ func createBackupArchive(destPath, dbPath, configPath string) error {
 	return file.Sync()
 }
 
+func createSQLiteSnapshot(dbPath string) (string, func(), error) {
+	tempFile, err := os.CreateTemp("", "qfs-backup-*.db")
+	if err != nil {
+		return "", func() {}, err
+	}
+	tempPath := tempFile.Name()
+	if err := tempFile.Close(); err != nil {
+		_ = os.Remove(tempPath)
+		return "", func() {}, err
+	}
+	if err := os.Remove(tempPath); err != nil && !os.IsNotExist(err) {
+		return "", func() {}, err
+	}
+
+	cleanup := func() {
+		_ = os.Remove(tempPath)
+	}
+
+	db, err := gorm.Open(sqlite.Open(dbPath), &gorm.Config{
+		Logger: logger.Default.LogMode(logger.Silent),
+	})
+	if err != nil {
+		cleanup()
+		return "", func() {}, err
+	}
+
+	sqlDB, err := db.DB()
+	if err != nil {
+		cleanup()
+		return "", func() {}, err
+	}
+	defer sqlDB.Close()
+
+	if err := db.Exec("PRAGMA busy_timeout = 5000").Error; err != nil {
+		cleanup()
+		return "", func() {}, fmt.Errorf("configure sqlite busy_timeout: %w", err)
+	}
+
+	literalPath := strings.ReplaceAll(tempPath, "'", "''")
+	if err := vacuumIntoWithRetry(db, literalPath); err != nil {
+		cleanup()
+		return "", func() {}, err
+	}
+
+	return tempPath, cleanup, nil
+}
+
+func vacuumIntoWithRetry(db *gorm.DB, literalPath string) error {
+	var lastErr error
+	for attempt := 0; attempt < 3; attempt++ {
+		if err := db.Exec("VACUUM INTO '" + literalPath + "'").Error; err != nil {
+			lastErr = err
+			if !isSQLiteBusyError(err) {
+				return fmt.Errorf("create sqlite snapshot: %w", err)
+			}
+			time.Sleep(time.Duration(attempt+1) * 250 * time.Millisecond)
+			continue
+		}
+		return nil
+	}
+	return fmt.Errorf("create sqlite snapshot after retries: %w", lastErr)
+}
+
+func isSQLiteBusyError(err error) bool {
+	if err == nil {
+		return false
+	}
+	lower := strings.ToLower(err.Error())
+	return strings.Contains(lower, "database is locked") || strings.Contains(lower, "database is busy")
+}
+
 func addZipOptionalFile(writer *zip.Writer, path string) error {
 	if _, err := os.Stat(path); err != nil {
 		return nil
@@ -282,6 +361,10 @@ func addZipOptionalFile(writer *zip.Writer, path string) error {
 }
 
 func addZipFile(writer *zip.Writer, path string) error {
+	return addZipFileAs(writer, path, filepath.Base(path))
+}
+
+func addZipFileAs(writer *zip.Writer, path string, archiveName string) error {
 	in, err := os.Open(path)
 	if err != nil {
 		return err
@@ -297,7 +380,7 @@ func addZipFile(writer *zip.Writer, path string) error {
 	if err != nil {
 		return err
 	}
-	header.Name = filepath.Base(path)
+	header.Name = archiveName
 	header.Method = zip.Deflate
 
 	out, err := writer.CreateHeader(header)
@@ -1,11 +1,15 @@
 package appstate
 
 import (
+	"archive/zip"
 	"os"
 	"path/filepath"
 	"strings"
 	"testing"
 	"time"
+
+	"github.com/glebarez/sqlite"
+	"gorm.io/gorm"
 )
 
 func TestEnsureRotatingLocalBackupCreatesAndRotates(t *testing.T) {
@@ -13,8 +17,8 @@ func TestEnsureRotatingLocalBackupCreatesAndRotates(t *testing.T) {
 	dbPath := filepath.Join(temp, "qfs.db")
 	cfgPath := filepath.Join(temp, "config.yaml")
 
-	if err := os.WriteFile(dbPath, []byte("db"), 0644); err != nil {
-		t.Fatalf("write db: %v", err)
+	if err := writeTestSQLiteDB(dbPath); err != nil {
+		t.Fatalf("write sqlite db: %v", err)
 	}
 	if err := os.WriteFile(cfgPath, []byte("cfg"), 0644); err != nil {
 		t.Fatalf("write config: %v", err)
@@ -36,6 +40,7 @@ func TestEnsureRotatingLocalBackupCreatesAndRotates(t *testing.T) {
 	if _, err := os.Stat(dailyArchive); err != nil {
 		t.Fatalf("daily archive missing: %v", err)
 	}
+	assertZipContains(t, dailyArchive, "qfs.db", "config.yaml")
 
 	backupNow = func() time.Time { return time.Date(2026, 2, 12, 10, 0, 0, 0, time.UTC) }
 	created, err = EnsureRotatingLocalBackup(dbPath, cfgPath)
@@ -57,8 +62,8 @@ func TestEnsureRotatingLocalBackupEnvControls(t *testing.T) {
 	dbPath := filepath.Join(temp, "qfs.db")
 	cfgPath := filepath.Join(temp, "config.yaml")
 
-	if err := os.WriteFile(dbPath, []byte("db"), 0644); err != nil {
-		t.Fatalf("write db: %v", err)
+	if err := writeTestSQLiteDB(dbPath); err != nil {
+		t.Fatalf("write sqlite db: %v", err)
 	}
 	if err := os.WriteFile(cfgPath, []byte("cfg"), 0644); err != nil {
 		t.Fatalf("write config: %v", err)
@@ -95,8 +100,8 @@ func TestEnsureRotatingLocalBackupRejectsGitWorktree(t *testing.T) {
 	if err := os.MkdirAll(filepath.Dir(dbPath), 0755); err != nil {
 		t.Fatalf("mkdir data dir: %v", err)
 	}
-	if err := os.WriteFile(dbPath, []byte("db"), 0644); err != nil {
-		t.Fatalf("write db: %v", err)
+	if err := writeTestSQLiteDB(dbPath); err != nil {
+		t.Fatalf("write sqlite db: %v", err)
 	}
 	if err := os.WriteFile(cfgPath, []byte("cfg"), 0644); err != nil {
 		t.Fatalf("write cfg: %v", err)
@@ -110,3 +115,43 @@ func TestEnsureRotatingLocalBackupRejectsGitWorktree(t *testing.T) {
 		t.Fatalf("unexpected error: %v", err)
 	}
 }
+
+func writeTestSQLiteDB(path string) error {
+	db, err := gorm.Open(sqlite.Open(path), &gorm.Config{})
+	if err != nil {
+		return err
+	}
+	sqlDB, err := db.DB()
+	if err != nil {
+		return err
+	}
+	defer sqlDB.Close()
+
+	return db.Exec(`
+CREATE TABLE sample_items (
+	id INTEGER PRIMARY KEY AUTOINCREMENT,
+	name TEXT NOT NULL
+);
+INSERT INTO sample_items(name) VALUES ('backup');
+`).Error
+}
+
+func assertZipContains(t *testing.T, archivePath string, expected ...string) {
+	t.Helper()
+
+	reader, err := zip.OpenReader(archivePath)
+	if err != nil {
+		t.Fatalf("open archive: %v", err)
+	}
+	defer reader.Close()
+
+	found := make(map[string]bool, len(reader.File))
+	for _, file := range reader.File {
+		found[file.Name] = true
+	}
+	for _, name := range expected {
+		if !found[name] {
+			t.Fatalf("archive %s missing %s", archivePath, name)
+		}
+	}
+}
@@ -7,19 +7,14 @@ import (
 	"strconv"
 	"time"
 
-	mysqlDriver "github.com/go-sql-driver/mysql"
 	"gopkg.in/yaml.v3"
 )
 
 type Config struct {
-	Server        ServerConfig        `yaml:"server"`
-	Database      DatabaseConfig      `yaml:"database"`
-	Pricing       PricingConfig       `yaml:"pricing"`
-	Export        ExportConfig        `yaml:"export"`
-	Alerts        AlertsConfig        `yaml:"alerts"`
-	Notifications NotificationsConfig `yaml:"notifications"`
-	Logging       LoggingConfig       `yaml:"logging"`
-	Backup        BackupConfig        `yaml:"backup"`
+	Server  ServerConfig  `yaml:"server"`
+	Export  ExportConfig  `yaml:"export"`
+	Logging LoggingConfig `yaml:"logging"`
+	Backup  BackupConfig  `yaml:"backup"`
 }
 
 type ServerConfig struct {
@@ -30,64 +25,6 @@ type ServerConfig struct {
 	WriteTimeout time.Duration `yaml:"write_timeout"`
 }
 
-type DatabaseConfig struct {
-	Host            string        `yaml:"host"`
-	Port            int           `yaml:"port"`
-	Name            string        `yaml:"name"`
-	User            string        `yaml:"user"`
-	Password        string        `yaml:"password"`
-	MaxOpenConns    int           `yaml:"max_open_conns"`
-	MaxIdleConns    int           `yaml:"max_idle_conns"`
-	ConnMaxLifetime time.Duration `yaml:"conn_max_lifetime"`
-}
-
-func (d *DatabaseConfig) DSN() string {
-	cfg := mysqlDriver.NewConfig()
-	cfg.User = d.User
-	cfg.Passwd = d.Password
-	cfg.Net = "tcp"
-	cfg.Addr = net.JoinHostPort(d.Host, strconv.Itoa(d.Port))
-	cfg.DBName = d.Name
-	cfg.ParseTime = true
-	cfg.Loc = time.Local
-	cfg.Params = map[string]string{
-		"charset": "utf8mb4",
-	}
-	return cfg.FormatDSN()
-}
-
-type PricingConfig struct {
-	DefaultMethod       string `yaml:"default_method"`
-	DefaultPeriodDays   int    `yaml:"default_period_days"`
-	FreshnessGreenDays  int    `yaml:"freshness_green_days"`
-	FreshnessYellowDays int    `yaml:"freshness_yellow_days"`
-	FreshnessRedDays    int    `yaml:"freshness_red_days"`
-	MinQuotesForMedian  int    `yaml:"min_quotes_for_median"`
-	PopularityDecayDays int    `yaml:"popularity_decay_days"`
-}
-
-type ExportConfig struct {
-	TempDir     string        `yaml:"temp_dir"`
-	MaxFileAge  time.Duration `yaml:"max_file_age"`
-	CompanyName string        `yaml:"company_name"`
-}
-
-type AlertsConfig struct {
-	Enabled                  bool          `yaml:"enabled"`
-	CheckInterval            time.Duration `yaml:"check_interval"`
-	HighDemandThreshold      int           `yaml:"high_demand_threshold"`
-	TrendingThresholdPercent int           `yaml:"trending_threshold_percent"`
-}
-
-type NotificationsConfig struct {
-	EmailEnabled bool   `yaml:"email_enabled"`
-	SMTPHost     string `yaml:"smtp_host"`
-	SMTPPort     int    `yaml:"smtp_port"`
-	SMTPUser     string `yaml:"smtp_user"`
-	SMTPPassword string `yaml:"smtp_password"`
-	FromAddress  string `yaml:"from_address"`
-}
-
 type LoggingConfig struct {
 	Level  string `yaml:"level"`
 	Format string `yaml:"format"`
@@ -95,6 +32,10 @@ type LoggingConfig struct {
 	FilePath string `yaml:"file_path"`
 }
 
+// ExportConfig is kept for constructor compatibility in export services.
+// Runtime no longer persists an export section in config.yaml.
+type ExportConfig struct{}
+
 type BackupConfig struct {
 	Time string `yaml:"time"`
 }
@@ -132,38 +73,6 @@ func (c *Config) setDefaults() {
 		c.Server.WriteTimeout = 30 * time.Second
 	}
 
-	if c.Database.Port == 0 {
-		c.Database.Port = 3306
-	}
-	if c.Database.MaxOpenConns == 0 {
-		c.Database.MaxOpenConns = 25
-	}
-	if c.Database.MaxIdleConns == 0 {
-		c.Database.MaxIdleConns = 5
-	}
-	if c.Database.ConnMaxLifetime == 0 {
-		c.Database.ConnMaxLifetime = 5 * time.Minute
-	}
-
-	if c.Pricing.DefaultMethod == "" {
-		c.Pricing.DefaultMethod = "weighted_median"
-	}
-	if c.Pricing.DefaultPeriodDays == 0 {
-		c.Pricing.DefaultPeriodDays = 90
-	}
-	if c.Pricing.FreshnessGreenDays == 0 {
-		c.Pricing.FreshnessGreenDays = 30
-	}
-	if c.Pricing.FreshnessYellowDays == 0 {
-		c.Pricing.FreshnessYellowDays = 60
-	}
-	if c.Pricing.FreshnessRedDays == 0 {
-		c.Pricing.FreshnessRedDays = 90
-	}
-	if c.Pricing.MinQuotesForMedian == 0 {
-		c.Pricing.MinQuotesForMedian = 3
-	}
-
 	if c.Logging.Level == "" {
 		c.Logging.Level = "info"
 	}
@@ -180,5 +89,5 @@ func (c *Config) setDefaults() {
 }
 
 func (c *Config) Address() string {
-	return fmt.Sprintf("%s:%d", c.Server.Host, c.Server.Port)
+	return net.JoinHostPort(c.Server.Host, strconv.Itoa(c.Server.Port))
 }
@@ -49,7 +49,7 @@ func (h *ComponentHandler) List(c *gin.Context) {
 	offset := (page - 1) * perPage
 	localComps, total, err := h.localDB.ListComponents(localFilter, offset, perPage)
 	if err != nil {
-		c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+		RespondError(c, http.StatusInternalServerError, "internal server error", err)
 		return
 	}
 
@@ -58,7 +58,7 @@ type ProjectExportOptionsRequest struct {
 func (h *ExportHandler) ExportCSV(c *gin.Context) {
 	var req ExportRequest
 	if err := c.ShouldBindJSON(&req); err != nil {
-		c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
+		RespondError(c, http.StatusBadRequest, "invalid request", err)
 		return
 	}
 
@@ -150,7 +150,7 @@ func (h *ExportHandler) ExportConfigCSV(c *gin.Context) {
 	// Get config before streaming (can return JSON error)
 	config, err := h.configService.GetByUUID(uuid, h.dbUsername)
 	if err != nil {
-		c.JSON(http.StatusNotFound, gin.H{"error": err.Error()})
+		RespondError(c, http.StatusNotFound, "resource not found", err)
 		return
 	}
 
@@ -193,13 +193,13 @@ func (h *ExportHandler) ExportProjectCSV(c *gin.Context) {
 
 	project, err := h.projectService.GetByUUID(projectUUID, h.dbUsername)
 	if err != nil {
-		c.JSON(http.StatusNotFound, gin.H{"error": err.Error()})
+		RespondError(c, http.StatusNotFound, "resource not found", err)
 		return
 	}
 
 	result, err := h.projectService.ListConfigurations(projectUUID, h.dbUsername, "active")
 	if err != nil {
-		c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+		RespondError(c, http.StatusInternalServerError, "internal server error", err)
 		return
 	}
 
@@ -226,19 +226,19 @@ func (h *ExportHandler) ExportProjectPricingCSV(c *gin.Context) {
 
 	var req ProjectExportOptionsRequest
 	if err := c.ShouldBindJSON(&req); err != nil {
-		c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
+		RespondError(c, http.StatusBadRequest, "invalid request", err)
 		return
 	}
 
 	project, err := h.projectService.GetByUUID(projectUUID, h.dbUsername)
 	if err != nil {
-		c.JSON(http.StatusNotFound, gin.H{"error": err.Error()})
+		RespondError(c, http.StatusNotFound, "resource not found", err)
 		return
 	}
 
 	result, err := h.projectService.ListConfigurations(projectUUID, h.dbUsername, "active")
 	if err != nil {
-		c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+		RespondError(c, http.StatusInternalServerError, "internal server error", err)
 		return
 	}
 	if len(result.Configs) == 0 {
@@ -256,7 +256,7 @@ func (h *ExportHandler) ExportProjectPricingCSV(c *gin.Context) {
 
 	data, err := h.exportService.ProjectToPricingExportData(result.Configs, opts)
 	if err != nil {
-		c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+		RespondError(c, http.StatusInternalServerError, "internal server error", err)
 		return
 	}
 
@@ -25,7 +25,7 @@ func (h *PartnumberBooksHandler) List(c *gin.Context) {
 	bookRepo := repository.NewPartnumberBookRepository(h.localDB.DB())
 	books, err := bookRepo.ListBooks()
 	if err != nil {
-		c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+		RespondError(c, http.StatusInternalServerError, "internal server error", err)
 		return
 	}
 
@@ -86,7 +86,7 @@ func (h *PartnumberBooksHandler) GetItems(c *gin.Context) {
 
 	items, total, err := bookRepo.GetBookItemsPage(book.ID, search, page, perPage)
 	if err != nil {
-		c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+		RespondError(c, http.StatusInternalServerError, "internal server error", err)
 		return
 	}
 
@@ -34,7 +34,7 @@ func (h *PricelistHandler) List(c *gin.Context) {
 
 	localPLs, err := h.localDB.GetLocalPricelists()
 	if err != nil {
-		c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+		RespondError(c, http.StatusInternalServerError, "internal server error", err)
 		return
 	}
 	if source != "" {
@@ -172,24 +172,48 @@ func (h *PricelistHandler) GetItems(c *gin.Context) {
 	}
 	var total int64
 	if err := dbq.Count(&total).Error; err != nil {
-		c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+		RespondError(c, http.StatusInternalServerError, "internal server error", err)
 		return
 	}
 	offset := (page - 1) * perPage
 
 	if err := dbq.Order("lot_name").Offset(offset).Limit(perPage).Find(&items).Error; err != nil {
-		c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+		RespondError(c, http.StatusInternalServerError, "internal server error", err)
 		return
 	}
+	lotNames := make([]string, len(items))
+	for i, item := range items {
+		lotNames[i] = item.LotName
+	}
+	type compRow struct {
+		LotName        string
+		LotDescription string
+	}
+	var comps []compRow
+	if len(lotNames) > 0 {
+		h.localDB.DB().Table("local_components").
+			Select("lot_name, lot_description").
+			Where("lot_name IN ?", lotNames).
+			Scan(&comps)
+	}
+	descMap := make(map[string]string, len(comps))
+	for _, c := range comps {
+		descMap[c.LotName] = c.LotDescription
+	}
 
 	resultItems := make([]gin.H, 0, len(items))
 	for _, item := range items {
 		resultItems = append(resultItems, gin.H{
-			"id":            item.ID,
-			"lot_name":      item.LotName,
-			"price":         item.Price,
-			"category":      item.LotCategory,
-			"available_qty": item.AvailableQty,
-			"partnumbers":   []string(item.Partnumbers),
+			"id":              item.ID,
+			"lot_name":        item.LotName,
+			"lot_description": descMap[item.LotName],
+			"price":           item.Price,
+			"category":        item.LotCategory,
+			"available_qty":   item.AvailableQty,
+			"partnumbers":     []string(item.Partnumbers),
 			"partnumber_qtys":  map[string]interface{}{},
 			"competitor_names": []string{},
 			"price_spread_pct": nil,
 		})
 	}
 
@@ -217,7 +241,7 @@ func (h *PricelistHandler) GetLotNames(c *gin.Context) {
 	}
 	items, err := h.localDB.GetLocalPricelistItems(localPL.ID)
 	if err != nil {
-		c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+		RespondError(c, http.StatusInternalServerError, "internal server error", err)
 		return
 	}
 	lotNames := make([]string, 0, len(items))
@@ -18,13 +18,13 @@ func NewQuoteHandler(quoteService *services.QuoteService) *QuoteHandler {
 func (h *QuoteHandler) Validate(c *gin.Context) {
 	var req services.QuoteRequest
 	if err := c.ShouldBindJSON(&req); err != nil {
-		c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
+		RespondError(c, http.StatusBadRequest, "invalid request", err)
 		return
 	}
 
 	result, err := h.quoteService.ValidateAndCalculate(&req)
 	if err != nil {
-		c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
+		RespondError(c, http.StatusBadRequest, "invalid request", err)
 		return
 	}
 
@@ -34,13 +34,13 @@ func (h *QuoteHandler) Validate(c *gin.Context) {
 func (h *QuoteHandler) Calculate(c *gin.Context) {
 	var req services.QuoteRequest
 	if err := c.ShouldBindJSON(&req); err != nil {
-		c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
+		RespondError(c, http.StatusBadRequest, "invalid request", err)
 		return
 	}
 
 	result, err := h.quoteService.ValidateAndCalculate(&req)
 	if err != nil {
-		c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
+		RespondError(c, http.StatusBadRequest, "invalid request", err)
 		return
 	}
 
@@ -53,13 +53,13 @@ func (h *QuoteHandler) Calculate(c *gin.Context) {
 func (h *QuoteHandler) PriceLevels(c *gin.Context) {
 	var req services.PriceLevelsRequest
 	if err := c.ShouldBindJSON(&req); err != nil {
-		c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
+		RespondError(c, http.StatusBadRequest, "invalid request", err)
 		return
 	}
 
 	result, err := h.quoteService.CalculatePriceLevels(&req)
 	if err != nil {
-		c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
+		RespondError(c, http.StatusBadRequest, "invalid request", err)
 		return
 	}
 
internal/handlers/respond.go (new file, 73 lines)
@@ -0,0 +1,73 @@
+package handlers
+
+import (
+	"encoding/json"
+	"errors"
+	"io"
+	"strings"
+
+	"github.com/gin-gonic/gin"
+)
+
+func RespondError(c *gin.Context, status int, fallback string, err error) {
+	if err != nil {
+		_ = c.Error(err)
+	}
+	c.JSON(status, gin.H{"error": clientFacingErrorMessage(status, fallback, err)})
+}
+
+func clientFacingErrorMessage(status int, fallback string, err error) string {
+	if err == nil {
+		return fallback
+	}
+	if status >= 500 {
+		return fallback
+	}
+	if isRequestDecodeError(err) {
+		return fallback
+	}
+
+	message := strings.TrimSpace(err.Error())
+	if message == "" {
+		return fallback
+	}
+	if looksTechnicalError(message) {
+		return fallback
+	}
+	return message
+}
+
+func isRequestDecodeError(err error) bool {
+	var syntaxErr *json.SyntaxError
+	if errors.As(err, &syntaxErr) {
+		return true
+	}
+
+	var unmarshalTypeErr *json.UnmarshalTypeError
+	if errors.As(err, &unmarshalTypeErr) {
+		return true
+	}
+
+	return errors.Is(err, io.ErrUnexpectedEOF) || errors.Is(err, io.EOF)
+}
+
+func looksTechnicalError(message string) bool {
+	lower := strings.ToLower(strings.TrimSpace(message))
+	needles := []string{
+		"sql",
+		"gorm",
+		"driver",
+		"constraint",
+		"syntax error",
+		"unexpected eof",
+		"record not found",
+		"no such table",
+		"stack trace",
+	}
+	for _, needle := range needles {
+		if strings.Contains(lower, needle) {
+			return true
+		}
+	}
+	return false
+}
internal/handlers/respond_test.go (new file, 41 lines)
@@ -0,0 +1,41 @@
+package handlers
+
+import (
+	"encoding/json"
+	"testing"
+)
+
+func TestClientFacingErrorMessageKeepsDomain4xx(t *testing.T) {
+	t.Parallel()
+
+	got := clientFacingErrorMessage(400, "invalid request", &json.SyntaxError{Offset: 1})
+	if got != "invalid request" {
+		t.Fatalf("expected fallback for decode error, got %q", got)
+	}
+}
+
+func TestClientFacingErrorMessagePreservesBusinessMessage(t *testing.T) {
+	t.Parallel()
+
+	err := errString("main project variant cannot be deleted")
+	got := clientFacingErrorMessage(400, "invalid request", err)
+	if got != err.Error() {
+		t.Fatalf("expected business message, got %q", got)
+	}
+}
+
+func TestClientFacingErrorMessageHidesTechnical4xx(t *testing.T) {
+	t.Parallel()
+
+	err := errString("sql: no rows in result set")
+	got := clientFacingErrorMessage(404, "resource not found", err)
+	if got != "resource not found" {
+		t.Fatalf("expected fallback for technical error, got %q", got)
+	}
+}
+
+type errString string
+
+func (e errString) Error() string {
+	return string(e)
+}
@@ -1,6 +1,7 @@
 package handlers
 
 import (
+	"errors"
 	"fmt"
 	"html/template"
 	"log/slog"
@@ -12,8 +13,8 @@ import (
 	qfassets "git.mchus.pro/mchus/quoteforge"
 	"git.mchus.pro/mchus/quoteforge/internal/db"
 	"git.mchus.pro/mchus/quoteforge/internal/localdb"
-	mysqlDriver "github.com/go-sql-driver/mysql"
 	"github.com/gin-gonic/gin"
+	mysqlDriver "github.com/go-sql-driver/mysql"
 	gormmysql "gorm.io/driver/mysql"
 	"gorm.io/gorm"
 	"gorm.io/gorm/logger"
@@ -26,6 +27,8 @@ type SetupHandler struct {
 	restartSig chan struct{}
 }
 
+var errPermissionProbeRollback = errors.New("permission probe rollback")
+
 func NewSetupHandler(localDB *localdb.LocalDB, connMgr *db.ConnectionManager, _ string, restartSig chan struct{}) (*SetupHandler, error) {
 	funcMap := template.FuncMap{
 		"sub": func(a, b int) int { return a - b },
@@ -64,7 +67,8 @@ func (h *SetupHandler) ShowSetup(c *gin.Context) {
 
 	tmpl := h.templates["setup.html"]
 	if err := tmpl.ExecuteTemplate(c.Writer, "setup.html", data); err != nil {
-		c.String(http.StatusInternalServerError, "Template error: %v", err)
+		_ = c.Error(err)
+		c.String(http.StatusInternalServerError, "Template error")
 	}
 }
 
@@ -89,49 +93,16 @@ func (h *SetupHandler) TestConnection(c *gin.Context) {
 	}
 
 	dsn := buildMySQLDSN(host, port, database, user, password, 5*time.Second)
 
-	db, err := gorm.Open(gormmysql.Open(dsn), &gorm.Config{
-		Logger: logger.Default.LogMode(logger.Silent),
-	})
+	lotCount, canWrite, err := validateMariaDBConnection(dsn)
 	if err != nil {
+		_ = c.Error(err)
 		c.JSON(http.StatusOK, gin.H{
 			"success": false,
-			"error":   fmt.Sprintf("Connection failed: %v", err),
+			"error":   "Connection check failed",
 		})
 		return
 	}
 
-	sqlDB, err := db.DB()
-	if err != nil {
-		c.JSON(http.StatusOK, gin.H{
-			"success": false,
-			"error":   fmt.Sprintf("Failed to get database handle: %v", err),
-		})
-		return
-	}
-	defer sqlDB.Close()
-
-	if err := sqlDB.Ping(); err != nil {
-		c.JSON(http.StatusOK, gin.H{
-			"success": false,
-			"error":   fmt.Sprintf("Ping failed: %v", err),
-		})
-		return
-	}
-
-	// Check for required tables
-	var lotCount int64
-	if err := db.Table("lot").Count(&lotCount).Error; err != nil {
-		c.JSON(http.StatusOK, gin.H{
-			"success": false,
-			"error":   fmt.Sprintf("Table 'lot' not found or inaccessible: %v", err),
-		})
-		return
-	}
-
-	// Check write permission
-	canWrite := testWritePermission(db)
-
 	c.JSON(http.StatusOK, gin.H{
 		"success":   true,
 		"lot_count": lotCount,
@@ -164,26 +135,21 @@ func (h *SetupHandler) SaveConnection(c *gin.Context) {
 
 	// Test connection first
 	dsn := buildMySQLDSN(host, port, database, user, password, 5*time.Second)
 
-	db, err := gorm.Open(gormmysql.Open(dsn), &gorm.Config{
-		Logger: logger.Default.LogMode(logger.Silent),
-	})
-	if err != nil {
+	if _, _, err := validateMariaDBConnection(dsn); err != nil {
+		_ = c.Error(err)
 		c.JSON(http.StatusBadRequest, gin.H{
 			"success": false,
-			"error":   fmt.Sprintf("Connection failed: %v", err),
+			"error":   "Connection check failed",
 		})
 		return
 	}
 
-	sqlDB, _ := db.DB()
-	sqlDB.Close()
-
 	// Save settings
 	if err := h.localDB.SaveSettings(host, port, database, user, password); err != nil {
+		_ = c.Error(err)
 		c.JSON(http.StatusInternalServerError, gin.H{
 			"success": false,
-			"error":   fmt.Sprintf("Failed to save settings: %v", err),
+			"error":   "Failed to save settings",
 		})
 		return
 	}
@@ -232,22 +198,6 @@ func (h *SetupHandler) GetStatus(c *gin.Context) {
 	})
 }
 
-func testWritePermission(db *gorm.DB) bool {
-	// Simple check: try to create a temporary table and drop it
-	testTable := fmt.Sprintf("qt_write_test_%d", time.Now().UnixNano())
-
-	// Try to create a test table
-	err := db.Exec(fmt.Sprintf("CREATE TABLE %s (id INT)", testTable)).Error
-	if err != nil {
-		return false
-	}
-
-	// Drop it immediately
-	db.Exec(fmt.Sprintf("DROP TABLE %s", testTable))
-
-	return true
-}
-
 func buildMySQLDSN(host string, port int, database, user, password string, timeout time.Duration) string {
 	cfg := mysqlDriver.NewConfig()
 	cfg.User = user
@@ -263,3 +213,47 @@ func buildMySQLDSN(host string, port int, database, user, password string, timeo
 	}
 	return cfg.FormatDSN()
 }
+
+func validateMariaDBConnection(dsn string) (int64, bool, error) {
+	db, err := gorm.Open(gormmysql.Open(dsn), &gorm.Config{
+		Logger: logger.Default.LogMode(logger.Silent),
+	})
+	if err != nil {
+		return 0, false, fmt.Errorf("open MariaDB connection: %w", err)
+	}
+
+	sqlDB, err := db.DB()
+	if err != nil {
+		return 0, false, fmt.Errorf("get database handle: %w", err)
+	}
+	defer sqlDB.Close()
+
+	if err := sqlDB.Ping(); err != nil {
+		return 0, false, fmt.Errorf("ping MariaDB: %w", err)
+	}
+
+	var lotCount int64
+	if err := db.Table("lot").Count(&lotCount).Error; err != nil {
+		return 0, false, fmt.Errorf("check required table lot: %w", err)
+	}
+
+	return lotCount, testSyncWritePermission(db), nil
+}
+
+func testSyncWritePermission(db *gorm.DB) bool {
+	sentinel := fmt.Sprintf("quoteforge-permission-check-%d", time.Now().UnixNano())
+	err := db.Transaction(func(tx *gorm.DB) error {
+		if err := tx.Exec(`
+			INSERT INTO qt_client_schema_state (username, hostname, last_checked_at, updated_at)
+			VALUES (?, ?, NOW(), NOW())
+			ON DUPLICATE KEY UPDATE
+				last_checked_at = VALUES(last_checked_at),
+				updated_at = VALUES(updated_at)
+		`, sentinel, "setup-check").Error; err != nil {
+			return err
+		}
+		return errPermissionProbeRollback
+	})
+
+	return errors.Is(err, errPermissionProbeRollback)
+}
@@ -116,9 +116,7 @@ func (h *SyncHandler) GetStatus(c *gin.Context) {
 func (h *SyncHandler) GetReadiness(c *gin.Context) {
 	readiness, err := h.syncService.GetReadiness()
 	if err != nil && readiness == nil {
-		c.JSON(http.StatusInternalServerError, gin.H{
-			"error": err.Error(),
-		})
+		RespondError(c, http.StatusInternalServerError, "internal server error", err)
 		return
 	}
 	if readiness == nil {
@@ -158,8 +156,9 @@ func (h *SyncHandler) ensureSyncReadiness(c *gin.Context) bool {
 
 		c.JSON(http.StatusInternalServerError, gin.H{
 			"success": false,
-			"error":   err.Error(),
+			"error":   "internal server error",
 		})
+		_ = c.Error(err)
 		_ = readiness
 		return false
 	}
@@ -184,8 +183,9 @@ func (h *SyncHandler) SyncComponents(c *gin.Context) {
 	if err != nil {
 		c.JSON(http.StatusServiceUnavailable, gin.H{
 			"success": false,
-			"error":   "Database connection failed: " + err.Error(),
+			"error":   "database connection failed",
 		})
+		_ = c.Error(err)
 		return
 	}
 
@@ -194,8 +194,9 @@ func (h *SyncHandler) SyncComponents(c *gin.Context) {
 		slog.Error("component sync failed", "error", err)
 		c.JSON(http.StatusInternalServerError, gin.H{
 			"success": false,
-			"error":   err.Error(),
+			"error":   "component sync failed",
 		})
+		_ = c.Error(err)
 		return
 	}
 
@@ -220,8 +221,9 @@ func (h *SyncHandler) SyncPricelists(c *gin.Context) {
 		slog.Error("pricelist sync failed", "error", err)
 		c.JSON(http.StatusInternalServerError, gin.H{
 			"success": false,
-			"error":   err.Error(),
+			"error":   "pricelist sync failed",
 		})
+		_ = c.Error(err)
 		return
 	}
 
@@ -247,8 +249,9 @@ func (h *SyncHandler) SyncPartnumberBooks(c *gin.Context) {
 		slog.Error("partnumber books pull failed", "error", err)
 		c.JSON(http.StatusInternalServerError, gin.H{
 			"success": false,
-			"error":   err.Error(),
+			"error":   "partnumber books sync failed",
 		})
+		_ = c.Error(err)
 		return
 	}
 
@@ -295,8 +298,9 @@ func (h *SyncHandler) SyncAll(c *gin.Context) {
 		slog.Error("pending push failed during full sync", "error", err)
 		c.JSON(http.StatusInternalServerError, gin.H{
 			"success": false,
-			"error":   "Pending changes push failed: " + err.Error(),
+			"error":   "pending changes push failed",
 		})
+		_ = c.Error(err)
 		return
 	}
 
@@ -305,8 +309,9 @@ func (h *SyncHandler) SyncAll(c *gin.Context) {
 	if err != nil {
 		c.JSON(http.StatusServiceUnavailable, gin.H{
 			"success": false,
-			"error":   "Database connection failed: " + err.Error(),
+			"error":   "database connection failed",
 		})
+		_ = c.Error(err)
 		return
 	}
 
@@ -315,8 +320,9 @@ func (h *SyncHandler) SyncAll(c *gin.Context) {
 		slog.Error("component sync failed during full sync", "error", err)
 		c.JSON(http.StatusInternalServerError, gin.H{
 			"success": false,
-			"error":   "Component sync failed: " + err.Error(),
+			"error":   "component sync failed",
 		})
+		_ = c.Error(err)
 		return
 	}
 	componentsSynced = compResult.TotalSynced
@@ -327,10 +333,11 @@ func (h *SyncHandler) SyncAll(c *gin.Context) {
 		slog.Error("pricelist sync failed during full sync", "error", err)
 		c.JSON(http.StatusInternalServerError, gin.H{
 			"success": false,
-			"error":             "Pricelist sync failed: " + err.Error(),
+			"error":             "pricelist sync failed",
 			"pending_pushed":    pendingPushed,
 			"components_synced": componentsSynced,
 		})
+		_ = c.Error(err)
 		return
 	}
 
@@ -339,11 +346,12 @@ func (h *SyncHandler) SyncAll(c *gin.Context) {
 		slog.Error("project import failed during full sync", "error", err)
 		c.JSON(http.StatusInternalServerError, gin.H{
 			"success": false,
-			"error":             "Project import failed: " + err.Error(),
+			"error":             "project import failed",
 			"pending_pushed":    pendingPushed,
 			"components_synced": componentsSynced,
 			"pricelists_synced": pricelistsSynced,
 		})
+		_ = c.Error(err)
 		return
 	}
 
@@ -352,7 +360,7 @@ func (h *SyncHandler) SyncAll(c *gin.Context) {
 		slog.Error("configuration import failed during full sync", "error", err)
 		c.JSON(http.StatusInternalServerError, gin.H{
 			"success": false,
-			"error":             "Configuration import failed: " + err.Error(),
+			"error":             "configuration import failed",
 			"pending_pushed":    pendingPushed,
 			"components_synced": componentsSynced,
 			"pricelists_synced": pricelistsSynced,
@@ -360,6 +368,7 @@ func (h *SyncHandler) SyncAll(c *gin.Context) {
 			"projects_updated": projectsResult.Updated,
 			"projects_skipped": projectsResult.Skipped,
 		})
+		_ = c.Error(err)
 		return
 	}
 
@@ -398,8 +407,9 @@ func (h *SyncHandler) PushPendingChanges(c *gin.Context) {
 		slog.Error("push pending changes failed", "error", err)
 		c.JSON(http.StatusInternalServerError, gin.H{
 			"success": false,
-			"error":   err.Error(),
+			"error":   "pending changes push failed",
 		})
+		_ = c.Error(err)
 		return
 	}
 
@@ -426,9 +436,7 @@ func (h *SyncHandler) GetPendingCount(c *gin.Context) {
 func (h *SyncHandler) GetPendingChanges(c *gin.Context) {
 	changes, err := h.localDB.GetPendingChanges()
 	if err != nil {
-		c.JSON(http.StatusInternalServerError, gin.H{
-			"error": err.Error(),
-		})
+		RespondError(c, http.StatusInternalServerError, "internal server error", err)
 		return
 	}
 
@@ -445,8 +453,9 @@ func (h *SyncHandler) RepairPendingChanges(c *gin.Context) {
 		slog.Error("repair pending changes failed", "error", err)
 		c.JSON(http.StatusInternalServerError, gin.H{
 			"success": false,
-			"error":   err.Error(),
+			"error":   "pending changes repair failed",
 		})
+		_ = c.Error(err)
 		return
 	}
 
@@ -588,9 +597,7 @@ func (h *SyncHandler) GetUsersStatus(c *gin.Context) {
 
 	users, err := h.syncService.ListUserSyncStatuses(threshold)
 	if err != nil {
-		c.JSON(http.StatusInternalServerError, gin.H{
-			"error": err.Error(),
-		})
+		RespondError(c, http.StatusInternalServerError, "internal server error", err)
 		return
 	}
 
@@ -639,7 +646,8 @@ func (h *SyncHandler) SyncStatusPartial(c *gin.Context) {
 	c.Header("Content-Type", "text/html; charset=utf-8")
 	if err := h.tmpl.ExecuteTemplate(c.Writer, "sync_status", data); err != nil {
 		slog.Error("failed to render sync_status template", "error", err)
-		c.String(http.StatusInternalServerError, "Template error: "+err.Error())
+		_ = c.Error(err)
+		c.String(http.StatusInternalServerError, "Template error")
 	}
 }
 
@@ -675,7 +683,7 @@ func (h *SyncHandler) ReportPartnumberSeen(c *gin.Context) {
 		} `json:"items"`
 	}
 	if err := c.ShouldBindJSON(&body); err != nil {
-		c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
+		RespondError(c, http.StatusBadRequest, "invalid request", err)
 		return
 	}
 
@@ -691,7 +699,7 @@ func (h *SyncHandler) ReportPartnumberSeen(c *gin.Context) {
 	}
 
 	if err := h.syncService.PushPartnumberSeen(items); err != nil {
-		c.JSON(http.StatusServiceUnavailable, gin.H{"error": err.Error()})
+		RespondError(c, http.StatusServiceUnavailable, "service unavailable", err)
 		return
 	}
 
@@ -62,7 +62,7 @@ func (h *VendorSpecHandler) PutVendorSpec(c *gin.Context) {
|
||||
VendorSpec []localdb.VendorSpecItem `json:"vendor_spec"`
|
||||
}
|
||||
if err := c.ShouldBindJSON(&body); err != nil {
|
||||
c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
|
||||
RespondError(c, http.StatusBadRequest, "invalid request", err)
|
||||
return
|
||||
}
|
||||
|
||||
@@ -82,11 +82,11 @@ func (h *VendorSpecHandler) PutVendorSpec(c *gin.Context) {
|
||||
spec := localdb.VendorSpec(body.VendorSpec)
|
||||
specJSON, err := json.Marshal(spec)
|
||||
if err != nil {
|
||||
c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
|
||||
RespondError(c, http.StatusInternalServerError, "internal server error", err)
|
||||
return
|
||||
}
|
||||
if err := h.localDB.DB().Model(cfg).Update("vendor_spec", string(specJSON)).Error; err != nil {
|
||||
c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
|
||||
RespondError(c, http.StatusInternalServerError, "internal server error", err)
|
||||
return
|
||||
}
|
||||
|
||||
@@ -138,7 +138,7 @@ func (h *VendorSpecHandler) ResolveVendorSpec(c *gin.Context) {
|
||||
VendorSpec []localdb.VendorSpecItem `json:"vendor_spec"`
|
||||
}
|
||||
if err := c.ShouldBindJSON(&body); err != nil {
|
||||
c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
|
||||
RespondError(c, http.StatusBadRequest, "invalid request", err)
|
||||
return
|
||||
}
|
||||
|
||||
@@ -147,14 +147,14 @@ func (h *VendorSpecHandler) ResolveVendorSpec(c *gin.Context) {

    resolved, err := resolver.Resolve(body.VendorSpec)
    if err != nil {
-       c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+       RespondError(c, http.StatusInternalServerError, "internal server error", err)
        return
    }

    book, _ := bookRepo.GetActiveBook()
    aggregated, err := services.AggregateLOTs(resolved, book, bookRepo)
    if err != nil {
-       c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+       RespondError(c, http.StatusInternalServerError, "internal server error", err)
        return
    }
@@ -181,7 +181,7 @@ func (h *VendorSpecHandler) ApplyVendorSpec(c *gin.Context) {
        } `json:"items"`
    }
    if err := c.ShouldBindJSON(&body); err != nil {
-       c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
+       RespondError(c, http.StatusBadRequest, "invalid request", err)
        return
    }
@@ -196,12 +196,12 @@ func (h *VendorSpecHandler) ApplyVendorSpec(c *gin.Context) {

    itemsJSON, err := json.Marshal(newItems)
    if err != nil {
-       c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+       RespondError(c, http.StatusInternalServerError, "internal server error", err)
        return
    }

    if err := h.localDB.DB().Model(cfg).Update("items", string(itemsJSON)).Error; err != nil {
-       c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+       RespondError(c, http.StatusInternalServerError, "internal server error", err)
        return
    }
@@ -1,6 +1,7 @@
 package handlers

 import (
+    "fmt"
     "html/template"
     "strconv"
     "strings"
@@ -114,12 +115,14 @@ func (h *WebHandler) render(c *gin.Context, name string, data gin.H) {
    c.Header("Content-Type", "text/html; charset=utf-8")
    tmpl, ok := h.templates[name]
    if !ok {
-       c.String(500, "Template not found: %s", name)
+       _ = c.Error(fmt.Errorf("template %q not found", name))
+       c.String(500, "Template error")
        return
    }
    // Execute the page template which will use base
    if err := tmpl.ExecuteTemplate(c.Writer, name, data); err != nil {
-       c.String(500, "Template error: %v", err)
+       _ = c.Error(err)
+       c.String(500, "Template error")
    }
 }
internal/handlers/web_test.go (new file, 47 lines)
@@ -0,0 +1,47 @@
+package handlers
+
+import (
+    "errors"
+    "html/template"
+    "net/http"
+    "net/http/httptest"
+    "strings"
+    "testing"
+
+    "github.com/gin-gonic/gin"
+)
+
+func TestWebHandlerRenderHidesTemplateExecutionError(t *testing.T) {
+    gin.SetMode(gin.TestMode)
+
+    tmpl := template.Must(template.New("broken.html").Funcs(template.FuncMap{
+        "boom": func() (string, error) {
+            return "", errors.New("secret template failure")
+        },
+    }).Parse(`{{define "broken.html"}}{{boom}}{{end}}`))
+
+    handler := &WebHandler{
+        templates: map[string]*template.Template{
+            "broken.html": tmpl,
+        },
+    }
+
+    rec := httptest.NewRecorder()
+    ctx, _ := gin.CreateTestContext(rec)
+    ctx.Request = httptest.NewRequest(http.MethodGet, "/broken", nil)
+
+    handler.render(ctx, "broken.html", gin.H{})
+
+    if rec.Code != http.StatusInternalServerError {
+        t.Fatalf("expected 500, got %d", rec.Code)
+    }
+    if body := strings.TrimSpace(rec.Body.String()); body != "Template error" {
+        t.Fatalf("expected generic template error, got %q", body)
+    }
+    if len(ctx.Errors) != 1 {
+        t.Fatalf("expected logged template error, got %d", len(ctx.Errors))
+    }
+    if !strings.Contains(ctx.Errors.String(), "secret template failure") {
+        t.Fatalf("expected original error in gin context, got %q", ctx.Errors.String())
+    }
+}
@@ -611,6 +611,46 @@ func (l *LocalDB) SaveProject(project *LocalProject) error {
    return l.db.Save(project).Error
 }

+// SaveProjectPreservingUpdatedAt stores a project without replacing UpdatedAt
+// with the current local sync time.
+func (l *LocalDB) SaveProjectPreservingUpdatedAt(project *LocalProject) error {
+    if project == nil {
+        return fmt.Errorf("project is nil")
+    }
+
+    if project.ID == 0 && strings.TrimSpace(project.UUID) != "" {
+        var existing LocalProject
+        err := l.db.Where("uuid = ?", project.UUID).First(&existing).Error
+        if err == nil {
+            project.ID = existing.ID
+        } else if !errors.Is(err, gorm.ErrRecordNotFound) {
+            return err
+        }
+    }
+
+    if project.ID == 0 {
+        return l.db.Create(project).Error
+    }
+
+    return l.db.Model(&LocalProject{}).
+        Where("id = ?", project.ID).
+        UpdateColumns(map[string]interface{}{
+            "uuid":           project.UUID,
+            "server_id":      project.ServerID,
+            "owner_username": project.OwnerUsername,
+            "code":           project.Code,
+            "variant":        project.Variant,
+            "name":           project.Name,
+            "tracker_url":    project.TrackerURL,
+            "is_active":      project.IsActive,
+            "is_system":      project.IsSystem,
+            "created_at":     project.CreatedAt,
+            "updated_at":     project.UpdatedAt,
+            "synced_at":      project.SyncedAt,
+            "sync_status":    project.SyncStatus,
+        }).Error
+}
+
 func (l *LocalDB) GetProjects(ownerUsername string, includeArchived bool) ([]LocalProject, error) {
    var projects []LocalProject
    query := l.db.Model(&LocalProject{}).Where("owner_username = ?", ownerUsername)
internal/localdb/project_sync_timestamp_test.go (new file, 53 lines)
@@ -0,0 +1,53 @@
+package localdb
+
+import (
+    "path/filepath"
+    "testing"
+    "time"
+)
+
+func TestSaveProjectPreservingUpdatedAtKeepsProvidedTimestamp(t *testing.T) {
+    dbPath := filepath.Join(t.TempDir(), "project_sync_timestamp.db")
+
+    local, err := New(dbPath)
+    if err != nil {
+        t.Fatalf("open localdb: %v", err)
+    }
+    t.Cleanup(func() { _ = local.Close() })
+
+    createdAt := time.Date(2026, 2, 1, 10, 0, 0, 0, time.UTC)
+    updatedAt := time.Date(2026, 2, 3, 12, 30, 0, 0, time.UTC)
+    project := &LocalProject{
+        UUID:          "project-1",
+        OwnerUsername: "tester",
+        Code:          "OPS-1",
+        Variant:       "Lenovo",
+        IsActive:      true,
+        CreatedAt:     createdAt,
+        UpdatedAt:     updatedAt,
+        SyncStatus:    "synced",
+    }
+
+    if err := local.SaveProjectPreservingUpdatedAt(project); err != nil {
+        t.Fatalf("save project: %v", err)
+    }
+
+    syncedAt := time.Date(2026, 3, 16, 8, 45, 0, 0, time.UTC)
+    project.SyncedAt = &syncedAt
+    project.SyncStatus = "synced"
+
+    if err := local.SaveProjectPreservingUpdatedAt(project); err != nil {
+        t.Fatalf("save project second time: %v", err)
+    }
+
+    stored, err := local.GetProjectByUUID(project.UUID)
+    if err != nil {
+        t.Fatalf("get project: %v", err)
+    }
+    if !stored.UpdatedAt.Equal(updatedAt) {
+        t.Fatalf("updated_at changed during sync save: got %s want %s", stored.UpdatedAt, updatedAt)
+    }
+    if stored.SyncedAt == nil || !stored.SyncedAt.Equal(syncedAt) {
+        t.Fatalf("synced_at not updated correctly: got %+v want %s", stored.SyncedAt, syncedAt)
+    }
+}
@@ -20,6 +20,7 @@ var (
    ErrProjectForbidden = errors.New("access to project forbidden")
    ErrProjectCodeExists = errors.New("project code and variant already exist")
    ErrCannotDeleteMainVariant = errors.New("cannot delete main variant")
+   ErrReservedMainVariant = errors.New("variant name 'main' is reserved")
 )

 type ProjectService struct {
@@ -63,6 +64,9 @@ func (s *ProjectService) Create(ownerUsername string, req *CreateProjectRequest)
        return nil, fmt.Errorf("project code is required")
    }
    variant := strings.TrimSpace(req.Variant)
+   if err := validateProjectVariantName(variant); err != nil {
+       return nil, err
+   }
    if err := s.ensureUniqueProjectCodeVariant("", code, variant); err != nil {
        return nil, err
    }
@@ -105,6 +109,9 @@ func (s *ProjectService) Update(projectUUID, ownerUsername string, req *UpdatePr
    }
    if req.Variant != nil {
        localProject.Variant = strings.TrimSpace(*req.Variant)
+       if err := validateProjectVariantName(localProject.Variant); err != nil {
+           return nil, err
+       }
    }
    if err := s.ensureUniqueProjectCodeVariant(projectUUID, localProject.Code, localProject.Variant); err != nil {
        return nil, err
@@ -166,6 +173,13 @@ func normalizeProjectVariant(variant string) string {
    return strings.ToLower(strings.TrimSpace(variant))
 }

+func validateProjectVariantName(variant string) error {
+   if normalizeProjectVariant(variant) == "main" {
+       return ErrReservedMainVariant
+   }
+   return nil
+}
+
 func (s *ProjectService) Archive(projectUUID, ownerUsername string) error {
    return s.setProjectActive(projectUUID, ownerUsername, false)
 }
internal/services/project_test.go (new file, 60 lines)
@@ -0,0 +1,60 @@
+package services
+
+import (
+    "errors"
+    "path/filepath"
+    "testing"
+
+    "git.mchus.pro/mchus/quoteforge/internal/localdb"
+)
+
+func TestProjectServiceCreateRejectsReservedMainVariant(t *testing.T) {
+    local, err := newProjectTestLocalDB(t)
+    if err != nil {
+        t.Fatalf("open localdb: %v", err)
+    }
+    service := NewProjectService(local)
+
+    _, err = service.Create("tester", &CreateProjectRequest{
+        Code:    "OPS-1",
+        Variant: "main",
+    })
+    if !errors.Is(err, ErrReservedMainVariant) {
+        t.Fatalf("expected ErrReservedMainVariant, got %v", err)
+    }
+}
+
+func TestProjectServiceUpdateRejectsReservedMainVariant(t *testing.T) {
+    local, err := newProjectTestLocalDB(t)
+    if err != nil {
+        t.Fatalf("open localdb: %v", err)
+    }
+    service := NewProjectService(local)
+
+    created, err := service.Create("tester", &CreateProjectRequest{
+        Code:    "OPS-1",
+        Variant: "Lenovo",
+    })
+    if err != nil {
+        t.Fatalf("create project: %v", err)
+    }
+
+    mainName := "main"
+    _, err = service.Update(created.UUID, "tester", &UpdateProjectRequest{
+        Variant: &mainName,
+    })
+    if !errors.Is(err, ErrReservedMainVariant) {
+        t.Fatalf("expected ErrReservedMainVariant, got %v", err)
+    }
+}
+
+func newProjectTestLocalDB(t *testing.T) (*localdb.LocalDB, error) {
+    t.Helper()
+    dbPath := filepath.Join(t.TempDir(), "project_test.db")
+    local, err := localdb.New(dbPath)
+    if err != nil {
+        return nil, err
+    }
+    t.Cleanup(func() { _ = local.Close() })
+    return local, nil
+}
@@ -215,7 +215,7 @@ func (s *Service) ImportProjectsToLocal() (*ProjectImportResult, error) {
    existing.SyncStatus = "synced"
    existing.SyncedAt = &now

-   if err := s.localDB.SaveProject(existing); err != nil {
+   if err := s.localDB.SaveProjectPreservingUpdatedAt(existing); err != nil {
        return nil, fmt.Errorf("saving local project %s: %w", project.UUID, err)
    }
    result.Updated++
@@ -225,7 +225,7 @@ func (s *Service) ImportProjectsToLocal() (*ProjectImportResult, error) {
    localProject := localdb.ProjectToLocal(&project)
    localProject.SyncStatus = "synced"
    localProject.SyncedAt = &now
-   if err := s.localDB.SaveProject(localProject); err != nil {
+   if err := s.localDB.SaveProjectPreservingUpdatedAt(localProject); err != nil {
        return nil, fmt.Errorf("saving local project %s: %w", project.UUID, err)
    }
    result.Imported++
@@ -1008,7 +1008,7 @@ func (s *Service) pushProjectChange(change *localdb.PendingChange) error {
        localProject.SyncStatus = "synced"
        now := time.Now()
        localProject.SyncedAt = &now
-       _ = s.localDB.SaveProject(localProject)
+       _ = s.localDB.SaveProjectPreservingUpdatedAt(localProject)
    }

    return nil
@@ -1278,7 +1278,7 @@ func (s *Service) ensureConfigurationProject(mariaDB *gorm.DB, cfg *models.Confi
        localProject.SyncStatus = "synced"
        now := time.Now()
        localProject.SyncedAt = &now
-       _ = s.localDB.SaveProject(localProject)
+       _ = s.localDB.SaveProjectPreservingUpdatedAt(localProject)
    }
    return nil
 }
memory.md (deleted, 41 lines)
@@ -1,41 +0,0 @@
-# Changes summary (2026-02-11)
-
-Implemented strict `lot_category` flow using `pricelist_items.lot_category` only (no parsing from `lot_name`), plus local caching and backfill:
-
-1. Local DB schema + migrations
-   - Added `lot_category` column to `local_pricelist_items` via `LocalPricelistItem` model.
-   - Added local migration `2026_02_11_local_pricelist_item_category` to add the column if missing and create indexes:
-     - `idx_local_pricelist_items_pricelist_lot (pricelist_id, lot_name)`
-     - `idx_local_pricelist_items_lot_category (lot_category)`
-
-2. Server model/repository
-   - Added `LotCategory` field to `models.PricelistItem`.
-   - `PricelistRepository.GetItems` now sets `Category` from `LotCategory` (no parsing from `lot_name`).
-
-3. Sync + local DB helpers
-   - `SyncPricelistItems` now saves `lot_category` into local cache via `PricelistItemToLocal`.
-   - Added `LocalDB.CountLocalPricelistItemsWithEmptyCategory` and `LocalDB.ReplaceLocalPricelistItems`.
-   - Added `LocalDB.GetLocalLotCategoriesByServerPricelistID` for strict category lookup.
-   - Added `SyncPricelists` backfill step: for used active pricelists with empty categories, force refresh items from server.
-
-4. API handler
-   - `GET /api/pricelists/:id/items` returns `category` from `local_pricelist_items.lot_category` (no parsing from `lot_name`).
-
-5. Article category foundation
-   - New package `internal/article`:
-     - `ResolveLotCategoriesStrict` pulls categories from local pricelist items and errors on missing category.
-     - `GroupForLotCategory` maps only allowed codes (CPU/MEM/GPU/M2/SSD/HDD/EDSFF/HHHL/NIC/HCA/DPU/PSU/PS) to article groups; excludes `SFP`.
-     - Error type `MissingCategoryForLotError` with base `ErrMissingCategoryForLot`.
-
-6. Tests
-   - Added unit tests for converters and article category resolver.
-   - Added handler test to ensure `/api/pricelists/:id/items` returns `lot_category`.
-   - Added sync test for category backfill on used pricelist items.
-   - `go test ./...` passed.
-
-Additional fixes (2026-02-11):
-- Fixed article parsing bug: CPU/GPU parsers were swapped in `internal/article/generator.go`. CPU now uses last token from CPU lot; GPU uses model+memory from `GPU_vendor_model_mem_iface`.
-- Adjusted configurator base tab layout to align labels on the same row (separate label row + input row grid).
-
-UI rule (2026-02-19):
-- In all breadcrumbs, truncate long specification/configuration names to 16 characters using ellipsis.
releases/README.md (new file, 18 lines)
@@ -0,0 +1,18 @@
+# Releases
+
+This directory stores packaged release artifacts and per-release notes.
+
+Rules:
+- keep release notes next to the matching release directory as `RELEASE_NOTES.md`;
+- do not keep duplicate changelog memory files elsewhere in the repository;
+- if a release directory has no notes yet, add them there instead of creating side documents.
+
+Current convention:
+
+```text
+releases/
+  v1.5.0/
+    RELEASE_NOTES.md
+    SHA256SUMS.txt
+    qfs-...
+```
@@ -1,72 +0,0 @@
-# v1.2.1 Release Notes
-
-**Date:** 2026-02-09
-**Changes since v1.2.0:** 2 commits
-
-## Summary
-Fixed configurator component substitution by updating to work with new pricelist-based pricing model. Addresses regression from v1.2.0 refactor that removed `CurrentPrice` field from components.
-
-## Commits
-
-### 1. Refactor: Remove CurrentPrice from local_components (5984a57)
-**Type:** Refactor
-**Files Changed:** 11 files, +167 insertions, -194 deletions
-
-#### Overview
-Transitioned from component-based pricing to pricelist-based pricing model:
-- Removed `CurrentPrice` and `SyncedAt` from LocalComponent (metadata-only now)
-- Added `WarehousePricelistID` and `CompetitorPricelistID` to LocalConfiguration
-- Removed 2 unused methods: UpdateComponentPricesFromPricelist, EnsureComponentPricesFromPricelists
-
-#### Key Changes
-- **Data Model:**
-  - LocalComponent: now stores only metadata (LotName, LotDescription, Category, Model)
-  - LocalConfiguration: added warehouse and competitor pricelist references
-
-- **Migrations:**
-  - drop_component_unused_fields - removes CurrentPrice, SyncedAt columns
-  - add_warehouse_competitor_pricelists - adds new pricelist fields
-
-- **Quote Calculation:**
-  - Updated to use pricelist_items instead of component.CurrentPrice
-  - Added PricelistID field to QuoteRequest
-  - Maintains offline-first behavior
-
-- **API:**
-  - Removed CurrentPrice from ComponentView
-  - Components API no longer returns pricing
-
-### 2. Fix: Load component prices via API (acf7c8a)
-**Type:** Bug Fix
-**Files Changed:** 1 file (web/templates/index.html), +66 insertions, -12 deletions
-
-#### Problem
-After v1.2.0 refactor, the configurator's autocomplete was filtering out all components because it checked for the removed `current_price` field on component objects.
-
-#### Solution
-Implemented on-demand price loading via API:
-- Added `ensurePricesLoaded()` function to fetch prices from `/api/quote/price-levels`
-- Added `componentPricesCache` to cache loaded prices in memory
-- Updated all 3 autocomplete modes (single, multi, section) to load prices when input is focused
-- Changed price validation from `c.current_price` to `hasComponentPrice(lot_name)`
-- Updated cart item creation to use cached API prices
-
-#### Impact
-- Components without prices are still filtered out (as required)
-- Price checks now use API data instead of removed database field
-- Frontend loads prices on-demand for better performance
-
-## Testing Notes
-- ✅ Configurator component substitution now works
-- ✅ Prices load correctly from pricelist
-- ✅ Offline mode still supported (prices cached after initial load)
-- ✅ Multi-pricelist support functional (estimate/warehouse/competitor)
-
-## Known Issues
-None
-
-## Migration Path
-No database migration needed from v1.2.0 - migrations were applied in v1.2.0 release.
-
-## Breaking Changes
-None for end users. Internal: `ComponentView` no longer includes `CurrentPrice` in API responses.
@@ -1,59 +0,0 @@
-# Release v1.2.2 (2026-02-09)
-
-## Summary
-
-Fixed CSV export filename inconsistency where project names weren't being resolved correctly. Standardized export format across both manual exports and project configuration exports to use `YYYY-MM-DD (project_name) config_name BOM.csv`.
-
-## Commits
-
-- `8f596ce` fix: standardize CSV export filename format to use project name
-
-## Changes
-
-### CSV Export Filename Standardization
-
-**Problem:**
-- ExportCSV and ExportConfigCSV had inconsistent filename formats
-- Project names sometimes fell back to config names when not explicitly provided
-- Export timestamps didn't reflect actual price update time
-
-**Solution:**
-- Unified format: `YYYY-MM-DD (project_name) config_name BOM.csv`
-- Both export paths now use PriceUpdatedAt if available, otherwise CreatedAt
-- Project name resolved from ProjectUUID via ProjectService for both paths
-- Frontend passes project_uuid context when exporting
-
-**Technical Details:**
-
-Backend:
-- Added `ProjectUUID` field to `ExportRequest` struct in handlers/export.go
-- Updated ExportCSV to look up project name from ProjectUUID using ProjectService
-- Ensured ExportConfigCSV gets project name from config's ProjectUUID
-- Both use CreatedAt (for ExportCSV) or PriceUpdatedAt/CreatedAt (for ExportConfigCSV)
-
-Frontend:
-- Added `projectUUID` and `projectName` state variables in index.html
-- Load and store projectUUID when configuration is loaded
-- Pass `project_uuid` in JSON body for both export requests
-
-## Files Modified
-
-- `internal/handlers/export.go` - Project name resolution and ExportRequest update
-- `internal/handlers/export_test.go` - Updated mock initialization with projectService param
-- `cmd/qfs/main.go` - Pass projectService to ExportHandler constructor
-- `web/templates/index.html` - Add projectUUID tracking and export payload updates
-
-## Testing Notes
-
-✅ All existing tests updated and passing
-✅ Code builds without errors
-✅ Export filename now includes correct project name
-✅ Works for both form-based and project-based exports
-
-## Breaking Changes
-
-None - API response format unchanged, only filename generation updated.
-
-## Known Issues
-
-None identified.
@@ -1,95 +0,0 @@
-# Release v1.2.3 (2026-02-10)
-
-## Summary
-
-Unified synchronization functionality with event-driven UI updates. Resolved user confusion about duplicate sync buttons by implementing a single sync source with automatic page refreshes.
-
-## Changes
-
-### Main Feature: Sync Event System
-
-- **Added `sync-completed` event** in base.html's `syncAction()` function
-  - Dispatched after successful `/api/sync/all` or `/api/sync/push`
-  - Includes endpoint and response data in event detail
-  - Enables pages to react automatically to sync completion
-
-### Configs Page (`configs.html`)
-
-- **Removed "Импорт с сервера" button** - duplicate functionality no longer needed
-- **Updated layout** - changed from 2-column grid to single button layout
-- **Removed `importConfigsFromServer()` function** - functionality now handled by navbar sync
-- **Added sync-completed event listener**:
-  - Automatically reloads configurations list after sync
-  - Resets pagination to first page
-  - New configurations appear immediately without manual refresh
-
-### Projects Page (`projects.html`)
-
-- **Wrapped initialization in DOMContentLoaded**:
-  - Moved `loadProjects()` and all event listeners inside handler
-  - Ensures DOM is fully loaded before accessing elements
-- **Added sync-completed event listener**:
-  - Automatically reloads projects list after sync
-  - New projects appear immediately without manual refresh
-
-### Pricelists Page (`pricelists.html`)
-
-- **Added sync-completed event listener** to existing DOMContentLoaded:
-  - Automatically reloads pricelists when sync completes
-  - Maintains existing permissions and modal functionality
-
-## Benefits
-
-### User Experience
-- ✅ Single "Синхронизация" button in navbar - no confusion about sync sources
-- ✅ Automatic list updates after sync - no need for manual F5 refresh
-- ✅ Consistent behavior across all pages (configs, projects, pricelists)
-- ✅ Better feedback: toast notification + automatic UI refresh
-
-### Architecture
-- ✅ Event-driven loose coupling between navbar and pages
-- ✅ Easy to extend to other pages (just add event listener)
-- ✅ No backend changes needed
-- ✅ Production-ready
-
-## Breaking Changes
-
-- **`/api/configs/import` endpoint** still works but UI button removed
-  - Users should use navbar "Синхронизация" button instead
-  - Backend API remains unchanged for backward compatibility
-
-## Files Modified
-
-1. `web/templates/base.html` - Added sync-completed event dispatch
-2. `web/templates/configs.html` - Event listener + removed duplicate UI
-3. `web/templates/projects.html` - DOMContentLoaded wrapper + event listener
-4. `web/templates/pricelists.html` - Event listener for auto-refresh
-
-**Stats:** 4 files changed, 59 insertions(+), 65 deletions(-)
-
-## Commits
-
-- `99fd80b` - feat: unify sync functionality with event-driven UI updates
-
-## Testing Checklist
-
-- [x] Configs page: New configurations appear after navbar sync
-- [x] Projects page: New projects appear after navbar sync
-- [x] Pricelists page: Pricelists refresh after navbar sync
-- [x] Both `/api/sync/all` and `/api/sync/push` trigger updates
-- [x] Toast notifications still show correctly
-- [x] Sync status indicator updates
-- [x] Error handling (423, network errors) still works
-- [x] Mode switching (Active/Archive) works correctly
-- [x] Backward compatibility maintained
-
-## Known Issues
-
-None - implementation is production-ready
-
-## Migration Notes
-
-No migration needed. Changes are frontend-only and backward compatible:
-- Old `/api/configs/import` endpoint still functional
-- No database schema changes
-- No configuration changes needed
@@ -1,68 +0,0 @@
-# Release v1.3.0 (2026-02-11)
-
-## Summary
-
-Introduced article generation with pricelist categories, added local configuration storage, and expanded sync/export capabilities. Simplified article generator compression and loosened project update constraints.
-
-## Changes
-
-### Main Features: Articles + Pricelist Categories
-
-- **Article generation pipeline**
-  - New generator and tests under `internal/article/`
-  - Category support with test coverage
-- **Pricelist category integration**
-  - Handler and repository updates
-  - Sync backfill test for category propagation
-
-### Local Configuration Storage
-
-- **Local DB support**
-  - New localdb models, converters, snapshots, and migrations
-  - Local configuration service for cached configurations
-
-### Export & UI
-
-- **Export handler updates** for article data output
-- **Configs and index templates** adjusted for new article-related fields
-
-### Behavior Changes
-
-- **Cross-user project updates allowed**
-  - Removed restriction in project service
-- **Article compression refinement**
-  - Generator logic simplified to reduce complexity
-
-## Breaking Changes
-
-None identified. Existing APIs remain intact.
-
-## Files Modified
-
-1. `internal/article/*` - Article generator + categories + tests
-2. `internal/localdb/*` - Local DB models, migrations, snapshots
-3. `internal/handlers/export.go` - Export updates
-4. `internal/handlers/pricelist.go` - Category handling
-5. `internal/services/sync/service.go` - Category backfill logic
-6. `web/templates/configs.html` - Article field updates
-7. `web/templates/index.html` - Article field updates
-
-**Stats:** 33 files changed, 2059 insertions(+), 329 deletions(-)
-
-## Commits
-
-- `5edffe8` - Add article generation and pricelist categories
-- `e355903` - Allow cross-user project updates
-- `e58fd35` - Refine article compression and simplify generator
-
-## Testing Checklist
-
-- [ ] Tests not run (not requested)
-
-## Migration Notes
-
-- New migrations:
-  - `022_add_article_to_configurations.sql`
-  - `023_add_server_model_to_configurations.sql`
-  - `024_add_support_code_to_configurations.sql`
-- Ensure migrations are applied before running v1.3.0
@@ -1,66 +0,0 @@
-# Release v1.3.2 (2026-02-19)
-
-## Summary
-
-Release focuses on stability and data integrity for local configurations. Added configuration revision history, stronger recovery for broken local sync/version states, improved sync self-healing, and clearer API error logging.
-
-## Changes
-
-### Configuration Revisions
-
-- Added full local configuration revision flow with storage and UI support.
-- Introduced revisions page/template and backend plumbing for browsing revisions.
-- Prevented duplicate revisions when content did not actually change.
-
-### Local Data Integrity and Recovery
-
-- Added migration and snapshot support for local configuration version data.
-- Hardened updates for legacy/orphaned configuration rows:
-  - allow update when project UUID is unchanged even if referenced project is missing locally;
-  - recover gracefully when `current_version_id` is stale or version rows are missing.
-- Added regression tests for orphan-project and missing-current-version scenarios.
-
-### Sync Reliability
-
-- Added smart self-healing path for sync errors.
-- Fixed duplicate-project sync edge cases.
-
-### API and Logging
-
-- Improved HTTP error mapping for configuration updates (`404/403` instead of generic `500` in known cases).
-- Enhanced request logger to capture error responses (status, response body snippet, gin errors) for failed requests.
-
-### UI and Export
-
-- Updated project detail and index templates for revisions and related UX improvements.
-- Updated export pipeline and tests to align with revisions/project behavior changes.
-
-## Breaking Changes
-
-None identified.
-
-## Files Changed
-
-- 24 files changed, 2394 insertions(+), 482 deletions(-)
-- Main touched areas:
-  - `internal/services/local_configuration.go`
-  - `internal/services/local_configuration_versioning_test.go`
-  - `internal/localdb/{localdb.go,migrations.go,snapshots.go,local_migrations_test.go}`
-  - `internal/services/export.go`
-  - `cmd/qfs/main.go`
-  - `web/templates/{config_revisions.html,project_detail.html,index.html,base.html}`
-
-## Commits Included (`v1.3.1..v1.3.2`)
-
-- `b153afb` - Add smart self-healing for sync errors
-- `8508ee2` - Fix sync errors for duplicate projects and add modal scrolling
-- `2e973b6` - Add configuration revisions system and project variant deletion
-- `71f73e2` - chore: save current changes
-- `cbaeafa` - Deduplicate configuration revisions and update revisions UI
-- `075fc70` - Harden local config updates and error logging
-
-## Testing
-
-- [x] Targeted tests for local configuration update/version recovery:
-  - `go test ./internal/services -run 'TestUpdateNoAuth(AllowsOrphanProjectWhenUUIDUnchanged|RecoversWhenCurrentVersionMissing|KeepsProjectWhenProjectUUIDOmitted)$'`
-- [ ] Full regression suite not run in this release step.
@@ -1,89 +1,20 @@
# QuoteForge v1.2.1

**Release date:** 2026-02-09
**Tag:** `v1.2.1`
**GitHub:** https://git.mchus.pro/mchus/QuoteForge/releases/tag/v1.2.1
Release date: 2026-02-09
Tag: `v1.2.1`

## Summary
## Key Changes

A quick patch release fixing a configurator regression after the v1.2.0 refactor. After the `CurrentPrice` field was removed from components, autocomplete stopped showing components. Prices are now loaded on demand through the API.
- fixed the autocomplete regression caused by dropping `CurrentPrice` from components;
- component prices are loaded via `/api/quote/price-levels`;
- accompanying release documentation prepared.

## What Was Fixed
## Release Commits

### 🐛 Configurator Component Substitution (acf7c8a)
- **Problem:** After the v1.2.0 refactor, autocomplete filtered out ALL components because it checked the removed `current_price` field.
- **Solution:** Load prices on demand via `/api/quote/price-levels`.
  - Added `componentPricesCache` for in-memory price caching.
  - The `ensurePricesLoaded()` function loads prices when the search field gains focus.
  - All 3 autocomplete modes (single, multi, section) were updated.
  - Components without prices are still filtered out (as required), but the check now goes through the API.
- **Affected files:** `web/templates/index.html` (+66 lines, -12 lines)
- `acf7c8a` fix: load component prices via API instead of removed current_price field
- `5984a57` refactor: remove CurrentPrice from local_components and transition to pricelist-based pricing
- `8fd27d1` docs: update v1.2.1 release notes with full changelog
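The on-demand loading pattern described here can be sketched as a memoized cache in front of the price API. This is a hypothetical shape for illustration; the real `componentPricesCache` and `ensurePricesLoaded()` live in `web/templates/index.html` and talk to `/api/quote/price-levels`.

```javascript
// Illustrative sketch of an on-demand price cache (not the actual code):
// prices are fetched once, then every later lookup is served from memory.
const componentPricesCache = {}; // lot_name -> price-level record
let pricesLoaded = false;

async function ensurePricesLoaded(fetchPrices) {
  if (pricesLoaded) return componentPricesCache; // cache hit: no network call
  const items = await fetchPrices();             // e.g. POST /api/quote/price-levels
  for (const item of items) {
    componentPricesCache[item.lot_name] = item;
  }
  pricesLoaded = true;
  return componentPricesCache;
}
```

Calling this from the search field's focus handler means the first keystroke already has prices available, and components without a cached price can still be filtered out without a per-component field.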
## History v1.2.0 → v1.2.1
## Compatibility

Total commits: **2**

| Hash | Author | Message |
|------|--------|---------|
| `acf7c8a` | Claude | fix: load component prices via API instead of removed current_price field |
| `5984a57` | Claude | refactor: remove CurrentPrice from local_components and transition to pricelist-based pricing |

## Testing

✅ Configurator component substitution works
✅ Prices load correctly from the pricelist
✅ Offline mode is supported (prices are cached after the first load)
✅ Multi-pricelist support is functional (estimate/warehouse/competitor)

## Breaking Changes

No breaking changes for end users.

⚠️ **For developers:** the `ComponentView` API no longer returns `CurrentPrice`.

## Migration

No database migration is required — all migrations were applied in v1.2.0.

## Installation

### macOS

```bash
# Download and unpack
tar xzf qfs-v1.2.1-darwin-arm64.tar.gz  # for Apple Silicon
# or
tar xzf qfs-v1.2.1-darwin-amd64.tar.gz  # for Intel Mac

# Remove the Gatekeeper quarantine attribute (if needed)
xattr -d com.apple.quarantine ./qfs

# Run
./qfs
```

### Linux

```bash
tar xzf qfs-v1.2.1-linux-amd64.tar.gz
./qfs
```

### Windows

```bash
# Unpack qfs-v1.2.1-windows-amd64.zip
# Run qfs.exe
```

## Known Issues

No known issues at release time.

## Support

For questions, contact [@mchus](https://git.mchus.pro/mchus)

---

*Sent with ❤️ via Claude Code*
- no additional migrations are required on top of `v1.2.0`.
releases/v1.5.3/RELEASE_NOTES.md (new file, 25 lines)
@@ -0,0 +1,25 @@
# QuoteForge v1.5.3

Release date: 2026-03-15
Tag: `v1.5.3`

## Key Changes

- project documentation cleaned up and brought to a single format;
- `bible-local/` trimmed down to the current architectural contracts, without historical noise;
- temporary notes and the duplicate changelog in `releases/memory` removed;
- runtime config simplified: dead sections removed from the active schema, leaving only the parts actually in use.

## Affected Areas

- the root `README.md`;
- all of `bible-local/`;
- `config.example.yaml`;
- `internal/config/config.go`;
- release notes and the rules for storing them in `releases/`.

## Compatibility

- the release does not change the user data model;
- no local or server migrations are required;
- the changes are limited to documentation and the shape of the runtime config.
releases/v1.5.4/RELEASE_NOTES.md (new file, 33 lines)
@@ -0,0 +1,33 @@
# QuoteForge v1.5.4

Release date: 2026-03-16
Tag: `v1.5.4`

## Key Changes

- the runtime automatically normalizes `server.host` to `127.0.0.1` and rewrites an invalid local config;
- variant actions added, and the `_копия` naming rules unified for variants and configurations;
- CSV export of pricing tables in the configurator fixed to an Excel-compatible format;
- the projects table reworked: a new date column, a details tooltip, a separate author column, compact actions, and a tracker link;
- sync no longer overwrites project `updated_at` with the sync timestamp;
- a one-off utility `cmd/migrate_project_updated_at` added to resync project `updated_at` from MariaDB into the local SQLite;
- `scripts/release.sh` no longer overwrites an existing `RELEASE_NOTES.md`.
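The Excel-compatible CSV fix above can be illustrated with a small sketch. The details here are assumptions for the example: a leading UTF-8 BOM and semicolon separators, which is what Excel with Russian regional settings typically expects; the real exporter is the `exportPricingCSV` function in the configurator template.

```javascript
// Illustrative Excel-friendly CSV builder (assumed conventions: ';' as the
// separator and a leading UTF-8 BOM so Excel detects the encoding).
function toExcelCSV(rows) {
  const escapeCell = v => {
    const s = String(v ?? '');
    // Quote cells containing the separator, quotes, or newlines.
    return /[";\n]/.test(s) ? '"' + s.replace(/"/g, '""') + '"' : s;
  };
  const body = rows.map(r => r.map(escapeCell).join(';')).join('\r\n');
  return '\uFEFF' + body; // BOM marks the file as UTF-8 for Excel
}
```

Without the BOM, Excel tends to misread Cyrillic descriptions as the legacy ANSI code page; without the locale-appropriate separator, the whole row lands in a single column.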
## Affected Areas

- `cmd/qfs/`;
- `cmd/migrate_project_updated_at/`;
- `internal/localdb/`;
- `internal/services/project.go`;
- `internal/services/sync/service.go`;
- `web/templates/index.html`;
- `web/templates/project_detail.html`;
- `web/templates/projects.html`;
- `web/templates/configs.html`;
- `bible-local/`.

## Compatibility

- the data schema does not change;
- no server SQL migrations are required;
- for already-corrupted local project dates, run `go run ./cmd/migrate_project_updated_at -apply` once.
@@ -21,13 +21,14 @@ fi
echo -e "${GREEN}Building QuoteForge version: ${VERSION}${NC}"
echo ""

# Create release directory
RELEASE_DIR="releases/${VERSION}"
mkdir -p "${RELEASE_DIR}"
ensure_release_notes() {
    local notes_path="$1"
    if [ -f "${notes_path}" ]; then
        echo -e "${GREEN}  ✓ Preserving existing RELEASE_NOTES.md${NC}"
        return
    fi

# Create release notes template (always include macOS Gatekeeper note)
if [ ! -f "${RELEASE_DIR}/RELEASE_NOTES.md" ]; then
cat > "${RELEASE_DIR}/RELEASE_NOTES.md" <<EOF
    cat > "${notes_path}" <<EOF
# QuoteForge ${VERSION}

Дата релиза: $(date +%Y-%m-%d)
@@ -42,7 +43,15 @@ cat > "${RELEASE_DIR}/RELEASE_NOTES.md" <<EOF
Снимите карантинный атрибут через терминал: \`xattr -d com.apple.quarantine /path/to/qfs-darwin-arm64\`
После этого бинарник запустится без предупреждения Gatekeeper.
EOF
fi
    echo -e "${GREEN}  ✓ Created RELEASE_NOTES.md template${NC}"
}

# Create release directory
RELEASE_DIR="releases/${VERSION}"
mkdir -p "${RELEASE_DIR}"

# Create release notes template only when missing.
ensure_release_notes "${RELEASE_DIR}/RELEASE_NOTES.md"

# Build for all platforms
echo -e "${YELLOW}→ Building binaries...${NC}"
todo.md (deleted, 78 lines)
@@ -1,78 +0,0 @@
# QuoteForge — Cleanup Plan (removing admin pricing)

Goal: remove everything related to price administration, warehouse stock reports, and alerts.
Keep: the configurator, projects, read-only pricelist viewing, sync, offline-first.

---

## 1. Delete files

- [x] `internal/handlers/pricing.go` (40.6KB) — the entire admin pricing UI
- [x] `internal/services/pricing/` — the entire price calculation package
- [x] `internal/services/pricelist/` — the entire pricelist management package
- [x] `internal/services/stock_import.go` — warehouse stock report import
- [x] `internal/services/alerts/` — the entire alerts package
- [x] `internal/warehouse/` — warehouse-based price calculation algorithms
- [x] `web/templates/admin_pricing.html` (109KB) — the admin pricing page
- [x] `cmd/cron/` — cron jobs (cleanup-pricelists, update-prices, update-popularity)
- [x] `cmd/importer/` — data import utility

## 2. Simplify `internal/handlers/pricelist.go` (read-only)

The read-only methods (List, Get, GetItems, GetLotNames, GetLatest) already work
through `h.localDB` (SQLite) only, without `pricelist.Service`.

- [x] Remove the `service *pricelist.Service` field from the `PricelistHandler` struct
- [x] Change the constructor: `NewPricelistHandler(localDB *localdb.LocalDB)`
- [x] Delete the write methods: `Create()`, `CreateWithProgress()`, `Delete()`, `SetActive()`, `CanWrite()`
- [x] Delete the `refreshLocalPricelistCacheFromServer()` method (depends on the service)
- [x] Remove the `pricelist` package import
- [x] Keep: `List()`, `Get()`, `GetItems()`, `GetLotNames()`, `GetLatest()`

## 3. Simplify `cmd/qfs/main.go`

- [x] Remove service construction: `pricingService`, `alertService`, `pricelistService`, `stockImportService`
- [x] Remove the handler: `pricingHandler`
- [x] Change `pricelistHandler` construction: `NewPricelistHandler(local)` (no service)
- [x] Remove repositories: `priceRepo`, `alertRepo` (keep statsRepo — nil-safe)
- [x] Remove all `/api/admin/pricing/*` routes (lines ~1407-1430)
- [x] Keep only read-only routes under `/api/pricelists/*`:
  - `GET ""` (List), `GET "/latest"`, `GET "/:id"`, `GET "/:id/items"`, `GET "/:id/lots"`
- [x] Remove the write routes: `POST ""`, `POST "/create-with-progress"`, `PATCH "/:id/active"`, `DELETE "/:id"`, `GET "/can-write"`
- [x] Remove the `/admin/pricing` web page
- [x] Fix `/pricelists` — make it a proper page instead of a redirect to admin/pricing
- [x] In the `QuoteService` constructor: pass `nil` for `pricingService`
- [x] Remove imports: the `pricing`, `pricelist`, `alerts` packages

## 4. Simplify `handlers/web.go`

- [x] Remove `admin_pricing.html` from `simplePages`
- [x] Remove the `AdminPricing()` method
- [x] Keep all other methods, including `Pricelists()` and `PricelistDetail()`

## 5. Simplify `base.html` (navigation)

- [x] Remove the "Администратор цен" link
- [x] Add a "Прайслисты" link (to `/pricelists`)
- [x] Keep: "Мои проекты", "Прайслисты", the sync indicator

## 6. Sync — keep in full

- Background worker: pull components + pricelists, push configurations
- All `/api/sync/*` endpoints remain
- This is the core of the offline-first architecture

## 7. Verification

- [x] `go build ./cmd/qfs` — compiles
- [x] `go vet ./...` — no errors
- [ ] Start the app → `/configs` works
- [ ] `/pricelists` — the read-only list works
- [ ] `/pricelists/:id` — details work
- [ ] Sync with the server works
- [ ] No admin pricing links left in the UI

## 8. Update CLAUDE.md

- [x] Remove the sections about admin pricing, stock import, alerts, cron
- [x] Update the API endpoints list
- [x] Update the application description
@@ -203,6 +203,8 @@ let projectsCache = [];
let projectNameByUUID = {};
let projectCodeByUUID = {};
let projectVariantByUUID = {};
let configProjectUUIDByUUID = {};
let configNameByUUID = {};
let pendingMoveConfigUUID = '';
let pendingMoveProjectCode = '';
let pendingCreateConfigName = '';
@@ -343,6 +345,45 @@ function findProjectByInput(input) {
    return null;
}

async function resolveUniqueConfigName(baseName, projectUUID, excludeUUID) {
    const cleanedBase = (baseName || '').trim();
    if (!cleanedBase) {
        return {error: 'Введите название'};
    }

    let configs = [];
    if (projectUUID) {
        const resp = await fetch('/api/projects/' + projectUUID + '/configs?status=all');
        if (!resp.ok) {
            return {error: 'Не удалось проверить конфигурации проекта'};
        }
        const data = await resp.json().catch(() => ({}));
        configs = Array.isArray(data.configurations) ? data.configurations : [];
    } else {
        configs = Object.keys(configProjectUUIDByUUID)
            .filter(uuid => !configProjectUUIDByUUID[uuid])
            .map(uuid => ({uuid: uuid, name: configNameByUUID[uuid] || ''}));
    }

    const used = new Set(
        configs
            .filter(cfg => !excludeUUID || cfg.uuid !== excludeUUID)
            .map(cfg => (cfg.name || '').trim().toLowerCase())
    );

    if (!used.has(cleanedBase.toLowerCase())) {
        return {name: cleanedBase, changed: false};
    }

    let candidate = cleanedBase + '_копия';
    let suffix = 2;
    while (used.has(candidate.toLowerCase())) {
        candidate = cleanedBase + '_копия' + suffix;
        suffix++;
    }
    return {name: candidate, changed: true};
}

function escapeHtml(text) {
    const div = document.createElement('div');
    div.textContent = text;
@@ -385,14 +426,23 @@ function closeRenameModal() {

async function renameConfig() {
    const uuid = document.getElementById('rename-uuid').value;
    const name = document.getElementById('rename-input').value.trim();
    const rawName = document.getElementById('rename-input').value.trim();

    if (!name) {
    if (!rawName) {
        alert('Введите название');
        return;
    }

    try {
        const result = await resolveUniqueConfigName(rawName, configProjectUUIDByUUID[uuid] || '', uuid);
        if (result.error) {
            alert(result.error);
            return;
        }
        const name = result.name;
        if (result.changed) {
            document.getElementById('rename-input').value = name;
        }
        const resp = await fetch('/api/configs/' + uuid + '/rename', {
            method: 'PATCH',
            headers: {
@@ -416,7 +466,7 @@ async function renameConfig() {

function openCloneModal(uuid, currentName) {
    document.getElementById('clone-uuid').value = uuid;
    document.getElementById('clone-input').value = currentName + ' (копия)';
    document.getElementById('clone-input').value = currentName + '_копия';
    document.getElementById('clone-modal').classList.remove('hidden');
    document.getElementById('clone-modal').classList.add('flex');
    document.getElementById('clone-input').focus();
@@ -430,14 +480,23 @@ function closeCloneModal() {

async function cloneConfig() {
    const uuid = document.getElementById('clone-uuid').value;
    const name = document.getElementById('clone-input').value.trim();
    const rawName = document.getElementById('clone-input').value.trim();

    if (!name) {
    if (!rawName) {
        alert('Введите название');
        return;
    }

    try {
        const result = await resolveUniqueConfigName(rawName, configProjectUUIDByUUID[uuid] || '', uuid);
        if (result.error) {
            alert(result.error);
            return;
        }
        const name = result.name;
        if (result.changed) {
            document.getElementById('clone-input').value = name;
        }
        const resp = await fetch('/api/configs/' + uuid + '/clone', {
            method: 'POST',
            headers: {
@@ -851,6 +910,12 @@ async function loadConfigs() {
    }

    const data = await resp.json();
    configProjectUUIDByUUID = {};
    configNameByUUID = {};
    (data.configurations || []).forEach(cfg => {
        configProjectUUIDByUUID[cfg.uuid] = cfg.project_uuid || '';
        configNameByUUID[cfg.uuid] = cfg.name || '';
    });
    renderConfigs(data.configurations || []);
    updatePagination(data.total);
} catch(e) {
@@ -199,58 +199,106 @@
        </div><!-- end top-section-bom -->

        <!-- Top-tab section: Ценообразование -->
        <div id="top-section-pricing" class="hidden">
        <div id="top-section-pricing" class="hidden space-y-6">

          <!-- === Цена покупки === -->
          <div class="bg-white rounded-lg shadow p-4">
            <div id="pricing-table-container">
            <div class="overflow-x-auto">
              <table class="w-full text-sm border-collapse">
                <thead class="bg-gray-50 text-gray-700">
                  <tr>
                    <th class="px-3 py-2 text-left border-b">LOT</th>
                    <th class="px-3 py-2 text-left border-b">PN вендора</th>
                    <th class="px-3 py-2 text-left border-b">Описание</th>
                    <th class="px-3 py-2 text-right border-b">Кол-во</th>
                    <th class="px-3 py-2 text-right border-b">Estimate</th>
                    <th class="px-3 py-2 text-right border-b">Цена проектная</th>
                    <th class="px-3 py-2 text-right border-b">Склад</th>
                    <th class="px-3 py-2 text-right border-b">Конк.</th>
                  </tr>
                </thead>
                <tbody id="pricing-table-body">
                  <tr><td colspan="8" class="px-3 py-8 text-center text-gray-400">Загрузите BOM во вкладке «BOM»</td></tr>
                </tbody>
                <tfoot id="pricing-table-foot" class="hidden bg-gray-50 font-semibold">
                  <tr>
                    <td colspan="4" class="px-3 py-2 text-right">Итого:</td>
                    <td class="px-3 py-2 text-right" id="pricing-total-estimate">—</td>
                    <td class="px-3 py-2 text-right font-bold" id="pricing-total-vendor">—</td>
                    <td class="px-3 py-2 text-right" id="pricing-total-warehouse">—</td>
                    <td class="px-3 py-2 text-right" id="pricing-total-competitor">—</td>
                  </tr>
                </tfoot>
              </table>
            </div>
            <div class="mt-4 flex flex-wrap items-center gap-4">
              <label class="text-sm font-medium text-gray-700">Своя цена:</label>
              <input type="number" id="pricing-custom-price" step="0.01" min="0" placeholder="0.00"
                     class="w-40 px-3 py-2 border rounded focus:ring-2 focus:ring-blue-500"
                     oninput="onPricingCustomPriceInput()">
              <label class="text-sm font-medium text-gray-700">Uplift:</label>
              <input type="text" id="pricing-uplift" inputmode="decimal" placeholder="1,0000"
                     class="w-32 px-3 py-2 border rounded focus:ring-2 focus:ring-blue-500"
                     oninput="onPricingUpliftInput()">
              <button onclick="setPricingCustomPriceFromVendor()" class="px-3 py-2 bg-gray-100 text-gray-700 rounded hover:bg-gray-200 border border-gray-300 text-sm">
                Проставить цены BOM
              </button>
              <button onclick="exportPricingCSV()" class="px-3 py-2 bg-green-600 text-white rounded hover:bg-green-700 text-sm">
                Экспорт CSV
              </button>
              <span id="pricing-discount-info" class="text-sm text-gray-500 hidden">
                Скидка от Estimate: <span id="pricing-discount-pct" class="font-semibold text-green-600"></span>
              </span>
            </div>
            <div class="flex items-baseline gap-3 mb-3">
              <h3 class="text-base font-semibold text-gray-800">Цена покупки</h3>
              <span class="text-xs text-gray-400">Цены указаны за 1 шт.</span>
            </div>
            <div class="overflow-x-auto">
              <table class="w-full text-sm border-collapse">
                <thead class="bg-gray-50 text-gray-700">
                  <tr>
                    <th class="px-3 py-2 text-left border-b">LOT</th>
                    <th class="px-3 py-2 text-left border-b">PN вендора</th>
                    <th class="px-3 py-2 text-left border-b">Описание</th>
                    <th class="px-3 py-2 text-right border-b">Кол-во</th>
                    <th class="px-3 py-2 text-right border-b">Estimate</th>
                    <th class="px-3 py-2 text-right border-b">Склад</th>
                    <th class="px-3 py-2 text-right border-b">Конкуренты</th>
                    <th class="px-3 py-2 text-right border-b">Ручная цена</th>
                  </tr>
                </thead>
                <tbody id="pricing-body-buy">
                  <tr><td colspan="8" class="px-3 py-8 text-center text-gray-400">Загрузите BOM во вкладке «BOM»</td></tr>
                </tbody>
                <tfoot id="pricing-foot-buy" class="hidden bg-gray-50 font-semibold">
                  <tr>
                    <td colspan="4" class="px-3 py-2 text-right">Итого:</td>
                    <td class="px-3 py-2 text-right" id="pricing-total-buy-estimate">—</td>
                    <td class="px-3 py-2 text-right" id="pricing-total-buy-warehouse">—</td>
                    <td class="px-3 py-2 text-right" id="pricing-total-buy-competitor">—</td>
                    <td class="px-3 py-2 text-right font-bold" id="pricing-total-buy-vendor">—</td>
                  </tr>
                </tfoot>
              </table>
            </div>
            <div class="mt-4 flex flex-wrap items-center gap-4">
              <label class="text-sm font-medium text-gray-700">Своя цена:</label>
              <input type="number" id="pricing-custom-price-buy" step="0.01" min="0" placeholder="0.00"
                     class="w-40 px-3 py-2 border rounded focus:ring-2 focus:ring-blue-500"
                     oninput="onBuyCustomPriceInput()">
              <button onclick="setPricingCustomPriceFromVendor()" class="px-3 py-2 bg-gray-100 text-gray-700 rounded hover:bg-gray-200 border border-gray-300 text-sm">
                Проставить цены BOM
              </button>
              <button onclick="exportPricingCSV('buy')" class="px-3 py-2 bg-green-600 text-white rounded hover:bg-green-700 text-sm">
                Экспорт CSV
              </button>
            </div>
          </div>

          <!-- === Цена продажи === -->
          <div class="bg-white rounded-lg shadow p-4">
            <div class="flex items-baseline gap-3 mb-1">
              <h3 class="text-base font-semibold text-gray-800">Цена продажи</h3>
              <span class="text-xs text-gray-400">Цены указаны за 1 шт.</span>
            </div>
            <p class="text-xs text-gray-500 mb-3">Склад и Конкуренты умножаются на 1,3</p>
            <div class="overflow-x-auto">
              <table class="w-full text-sm border-collapse">
                <thead class="bg-gray-50 text-gray-700">
                  <tr>
                    <th class="px-3 py-2 text-left border-b">LOT</th>
                    <th class="px-3 py-2 text-left border-b">PN вендора</th>
                    <th class="px-3 py-2 text-left border-b">Описание</th>
                    <th class="px-3 py-2 text-right border-b">Кол-во</th>
                    <th class="px-3 py-2 text-right border-b">Estimate</th>
                    <th class="px-3 py-2 text-right border-b">Склад</th>
                    <th class="px-3 py-2 text-right border-b">Конкуренты</th>
                    <th class="px-3 py-2 text-right border-b">Ручная цена</th>
                  </tr>
                </thead>
                <tbody id="pricing-body-sale">
                  <tr><td colspan="8" class="px-3 py-8 text-center text-gray-400">Загрузите BOM во вкладке «BOM»</td></tr>
                </tbody>
                <tfoot id="pricing-foot-sale" class="hidden bg-gray-50 font-semibold">
                  <tr>
                    <td colspan="4" class="px-3 py-2 text-right">Итого:</td>
                    <td class="px-3 py-2 text-right" id="pricing-total-sale-estimate">—</td>
                    <td class="px-3 py-2 text-right" id="pricing-total-sale-warehouse">—</td>
                    <td class="px-3 py-2 text-right" id="pricing-total-sale-competitor">—</td>
                    <td class="px-3 py-2 text-right font-bold" id="pricing-total-sale-vendor">—</td>
                  </tr>
                </tfoot>
              </table>
            </div>
            <div class="mt-4 flex flex-wrap items-center gap-4">
              <label class="text-sm font-medium text-gray-700">Аплифт к estimate:</label>
              <input type="text" id="pricing-uplift-sale" inputmode="decimal" placeholder="1,3000"
                     class="w-28 px-3 py-2 border rounded focus:ring-2 focus:ring-blue-500"
                     oninput="onSaleMarkupInput()">
              <label class="text-sm font-medium text-gray-700">Своя цена:</label>
              <input type="number" id="pricing-custom-price-sale" step="0.01" min="0" placeholder="0.00"
                     class="w-40 px-3 py-2 border rounded focus:ring-2 focus:ring-blue-500"
                     oninput="onSaleCustomPriceInput()">
              <button onclick="exportPricingCSV('sale')" class="px-3 py-2 bg-green-600 text-white rounded hover:bg-green-700 text-sm">
                Экспорт CSV
              </button>
            </div>
          </div>

        </div><!-- end top-section-pricing -->

        </div>
@@ -3481,8 +3529,10 @@ async function loadVendorSpec(configUUID) {
// ==================== ЦЕНООБРАЗОВАНИЕ ====================

async function renderPricingTab() {
    const tbody = document.getElementById('pricing-table-body');
    const tfoot = document.getElementById('pricing-table-foot');
    const tbodyBuy = document.getElementById('pricing-body-buy');
    const tfootBuy = document.getElementById('pricing-foot-buy');
    const tbodySale = document.getElementById('pricing-body-sale');
    const tfootSale = document.getElementById('pricing-foot-sale');

    const cart = window._currentCart || [];
    const compMap = {};
@@ -3513,7 +3563,6 @@ async function renderPricingTab() {
            });
        }
    });
    // Also price LOTs that exist in current Estimate but are not covered by BOM mappings.
    cart.forEach(item => {
        if (!item?.lot_name || seen.has(item.lot_name)) return;
        seen.add(item.lot_name);
@@ -3524,7 +3573,7 @@ async function renderPricingTab() {
    }

    // Fetch fresh price levels for these LOTs
    const priceMap = {}; // lot_name → {estimate_price, ...}
    const priceMap = {};
    if (itemsForPriceLevels.length) {
        try {
            const payload = {
@@ -3543,224 +3592,207 @@ async function renderPricingTab() {
                const data = await resp.json();
                (data.items || []).forEach(i => { priceMap[i.lot_name] = i; });
            }
        } catch(e) { /* silent — pricing tab renders with available data */ }
        } catch(e) { /* silent */ }
    }

    let totalVendor = 0, totalEstimate = 0, totalWarehouse = 0, totalCompetitor = 0;
    let hasVendor = false, hasEstimate = false, hasWarehouse = false, hasCompetitor = false;
    // Sale uplift applied to estimate (default 1.3)
    const saleUplift = (() => {
        const v = parseDecimalInput(document.getElementById('pricing-uplift-sale')?.value || '');
        return v > 0 ? v : 1.3;
    })();
    const SALE_FIXED_MULT = 1.3;

    tbody.innerHTML = '';

    if (!bomRows.length) {
    if (!cart.length) {
        tbody.innerHTML = '<tr><td colspan="9" class="px-3 py-8 text-center text-gray-400">Нет данных для отображения</td></tr>';
        tfoot.classList.add('hidden');
        return;
    }
    cart.forEach(item => {
        const tr = document.createElement('tr');
        tr.classList.add('pricing-row');
        const pl = priceMap[item.lot_name];
        const estUnit = (pl && pl.estimate_price > 0) ? pl.estimate_price : (item.unit_price || 0);
        const warehouseUnit = (pl && pl.warehouse_price > 0) ? pl.warehouse_price : null;
        const competitorUnit = (pl && pl.competitor_price > 0) ? pl.competitor_price : null;
        const estimateTotal = estUnit * item.quantity;
        const warehouseTotal = warehouseUnit != null ? warehouseUnit * item.quantity : null;
        const competitorTotal = competitorUnit != null ? competitorUnit * item.quantity : null;
        if (estimateTotal > 0) { totalEstimate += estimateTotal; hasEstimate = true; }
        if (warehouseTotal != null && warehouseTotal > 0) { totalWarehouse += warehouseTotal; hasWarehouse = true; }
        if (competitorTotal != null && competitorTotal > 0) { totalCompetitor += competitorTotal; hasCompetitor = true; }
        tr.dataset.est = estimateTotal;
        const desc = (compMap[item.lot_name] || {}).description || '';
        tr.dataset.vendorOrig = '';
        tr.innerHTML = `
            <td class="px-3 py-1.5 text-xs">${escapeHtml(item.lot_name)}</td>
            <td class="px-3 py-1.5 text-xs text-gray-400">—</td>
            <td class="px-3 py-1.5 text-xs text-gray-500 truncate max-w-xs">${escapeHtml(desc)}</td>
            <td class="px-3 py-1.5 text-right">${item.quantity}</td>
            <td class="px-3 py-1.5 text-right text-xs">${estimateTotal > 0 ? formatCurrency(estimateTotal) : '—'}</td>
            <td class="px-3 py-1.5 text-right text-xs text-gray-400 pricing-vendor-price">—</td>
            <td class="px-3 py-1.5 text-right text-xs">${warehouseTotal != null && warehouseTotal > 0 ? formatCurrency(warehouseTotal) : '—'}</td>
            <td class="px-3 py-1.5 text-right text-xs">${competitorTotal != null && competitorTotal > 0 ? formatCurrency(competitorTotal) : '—'}</td>
        `;
        tbody.appendChild(tr);
    });
    } else {
        const coveredLots = new Set();
        bomRows.forEach(row => {
            const tr = document.createElement('tr');
            tr.classList.add('pricing-row');
            const baseLot = rowBaseLot(row);
            const allocs = _getRowAllocations(row).filter(a => a.lot_name && _bomLotValid(a.lot_name) && a.quantity >= 1);
            if (baseLot) coveredLots.add(baseLot);
            allocs.forEach(a => coveredLots.add(a.lot_name));
            const hasMapping = !!baseLot || allocs.length > 0;
            const isUnresolved = !hasMapping;

            let rowEst = 0;
            let hasEstimateForRow = false;
            let rowWarehouse = 0;
            let hasWarehouseForRow = false;
            let rowCompetitor = 0;
            let hasCompetitorForRow = false;
            if (baseLot) {
                const pl = priceMap[baseLot];
                const estimateUnit = (pl && pl.estimate_price > 0) ? pl.estimate_price : null;
                const warehouseUnit = (pl && pl.warehouse_price > 0) ? pl.warehouse_price : null;
                const competitorUnit = (pl && pl.competitor_price > 0) ? pl.competitor_price : null;
                if (estimateUnit != null) {
                    rowEst += estimateUnit * row.quantity * _getRowLotQtyPerPN(row);
                    hasEstimateForRow = true;
                }
                if (warehouseUnit != null) {
                    rowWarehouse += warehouseUnit * row.quantity * _getRowLotQtyPerPN(row);
                    hasWarehouseForRow = true;
                }
                if (competitorUnit != null) {
                    rowCompetitor += competitorUnit * row.quantity * _getRowLotQtyPerPN(row);
                    hasCompetitorForRow = true;
                }
            }
            allocs.forEach(a => {
                const pl = priceMap[a.lot_name];
                const estimateUnit = (pl && pl.estimate_price > 0) ? pl.estimate_price : null;
                const warehouseUnit = (pl && pl.warehouse_price > 0) ? pl.warehouse_price : null;
                const competitorUnit = (pl && pl.competitor_price > 0) ? pl.competitor_price : null;
                if (estimateUnit != null) {
                    rowEst += estimateUnit * row.quantity * a.quantity;
                    hasEstimateForRow = true;
                }
                if (warehouseUnit != null) {
                    rowWarehouse += warehouseUnit * row.quantity * a.quantity;
                    hasWarehouseForRow = true;
                }
                if (competitorUnit != null) {
                    rowCompetitor += competitorUnit * row.quantity * a.quantity;
                    hasCompetitorForRow = true;
                }
            });

            const vendorTotal = row.total_price != null ? row.total_price : (row.unit_price != null ? row.unit_price * row.quantity : null);
            if (vendorTotal != null) { totalVendor += vendorTotal; hasVendor = true; }
            if (hasEstimateForRow) { totalEstimate += rowEst; hasEstimate = true; }
            if (hasWarehouseForRow) { totalWarehouse += rowWarehouse; hasWarehouse = true; }
            if (hasCompetitorForRow) { totalCompetitor += rowCompetitor; hasCompetitor = true; }

            tr.dataset.est = rowEst;
            tr.dataset.vendorOrig = vendorTotal != null ? vendorTotal : '';
            const desc = row.description || (baseLot ? ((compMap[baseLot] || {}).description || '') : '');
            let lotCell = '<span class="text-red-500">н/д</span>';
            if (baseLot && allocs.length) {
                lotCell = `${escapeHtml(baseLot)} <span class="text-gray-400">+${allocs.length}</span>`;
            } else if (baseLot) {
                lotCell = escapeHtml(baseLot);
            } else if (allocs.length) {
                lotCell = `${escapeHtml(allocs[0].lot_name)}${allocs.length > 1 ? ` <span class="text-gray-400">+${allocs.length - 1}</span>` : ''}`;
            }
            tr.innerHTML = `
                <td class="px-3 py-1.5 text-xs">${lotCell}</td>
                <td class="px-3 py-1.5 font-mono text-xs">${escapeHtml(row.vendor_pn)}</td>
                <td class="px-3 py-1.5 text-xs text-gray-500 truncate max-w-xs">${escapeHtml(desc)}</td>
                <td class="px-3 py-1.5 text-right">${row.quantity}</td>
                <td class="px-3 py-1.5 text-right text-xs">${hasEstimateForRow ? formatCurrency(rowEst) : '—'}</td>
                <td class="px-3 py-1.5 text-right text-xs pricing-vendor-price">${vendorTotal != null ? formatCurrency(vendorTotal) : '—'}</td>
                <td class="px-3 py-1.5 text-right text-xs">${hasWarehouseForRow ? formatCurrency(rowWarehouse) : '—'}</td>
                <td class="px-3 py-1.5 text-right text-xs">${hasCompetitorForRow ? formatCurrency(rowCompetitor) : '—'}</td>
            `;
            tbody.appendChild(tr);
        });

        // Append Estimate-only LOTs that were counted in cart but not mapped from BOM.
        cart.forEach(item => {
            if (!item?.lot_name || coveredLots.has(item.lot_name)) return;
            const tr = document.createElement('tr');
            tr.classList.add('pricing-row');
            tr.classList.add('bg-blue-50');
            const pl = priceMap[item.lot_name];
            const estUnit = (pl && pl.estimate_price > 0) ? pl.estimate_price : (item.unit_price || 0);
            const warehouseUnit = (pl && pl.warehouse_price > 0) ? pl.warehouse_price : null;
|
||||
const competitorUnit = (pl && pl.competitor_price > 0) ? pl.competitor_price : null;
|
||||
const estimateTotal = estUnit * item.quantity;
|
||||
const warehouseTotal = warehouseUnit != null ? warehouseUnit * item.quantity : null;
|
||||
const competitorTotal = competitorUnit != null ? competitorUnit * item.quantity : null;
|
||||
if (estimateTotal > 0) { totalEstimate += estimateTotal; hasEstimate = true; }
|
||||
if (warehouseTotal != null && warehouseTotal > 0) { totalWarehouse += warehouseTotal; hasWarehouse = true; }
|
||||
if (competitorTotal != null && competitorTotal > 0) { totalCompetitor += competitorTotal; hasCompetitor = true; }
|
||||
tr.dataset.est = estimateTotal;
|
||||
tr.dataset.vendorOrig = '';
|
||||
const desc = (compMap[item.lot_name] || {}).description || '';
|
||||
tr.innerHTML = `
|
||||
<td class="px-3 py-1.5 text-xs">${escapeHtml(item.lot_name)}</td>
|
||||
<td class="px-3 py-1.5 text-xs text-gray-400">—</td>
|
||||
<td class="px-3 py-1.5 text-xs text-gray-500 truncate max-w-xs">${escapeHtml(desc)}</td>
|
||||
<td class="px-3 py-1.5 text-right">${item.quantity}</td>
|
||||
<td class="px-3 py-1.5 text-right text-xs">${estimateTotal > 0 ? formatCurrency(estimateTotal) : '—'}</td>
|
||||
<td class="px-3 py-1.5 text-right text-xs text-gray-400 pricing-vendor-price">—</td>
|
||||
<td class="px-3 py-1.5 text-right text-xs">${warehouseTotal != null && warehouseTotal > 0 ? formatCurrency(warehouseTotal) : '—'}</td>
|
||||
<td class="px-3 py-1.5 text-right text-xs">${competitorTotal != null && competitorTotal > 0 ? formatCurrency(competitorTotal) : '—'}</td>
|
||||
`;
|
||||
tbody.appendChild(tr);
|
||||
});
|
||||
}
|
||||
|
||||
// Totals row
|
||||
document.getElementById('pricing-total-vendor').textContent = hasVendor ? formatCurrency(totalVendor) : '—';
|
||||
document.getElementById('pricing-total-estimate').textContent = hasEstimate ? formatCurrency(totalEstimate) : '—';
|
||||
document.getElementById('pricing-total-warehouse').textContent = hasWarehouse ? formatCurrency(totalWarehouse) : '—';
|
||||
document.getElementById('pricing-total-competitor').textContent = hasCompetitor ? formatCurrency(totalCompetitor) : '—';
|
||||
tfoot.classList.remove('hidden');
|
||||
|
||||
// Update custom price proportional breakdown
|
||||
onPricingCustomPriceInput();
|
||||
}
|
||||
|
||||
function setPricingCustomPriceFromVendor() {
// Apply per-row BOM prices directly (not proportional redistribution)
const rows = document.querySelectorAll('#pricing-table-body tr.pricing-row');
const vendorCells = document.querySelectorAll('#pricing-table-body .pricing-vendor-price');
let total = 0;
let hasAny = false;

rows.forEach((tr, i) => {
const cell = vendorCells[i];
if (!cell) return;
const orig = tr.dataset.vendorOrig;
if (orig !== '') {
const v = parseFloat(orig);
cell.textContent = formatCurrency(v);
cell.classList.remove('text-blue-700', 'text-gray-400');
total += v;
hasAny = true;
} else {
cell.textContent = '—';
cell.classList.add('text-gray-400');
cell.classList.remove('text-blue-700');
}
// Helper: returns unit prices from pricelist for a single LOT
const _getUnitPrices = (pl) => ({
estUnit: (pl && pl.estimate_price > 0) ? pl.estimate_price : 0,
warehouseUnit: (pl && pl.warehouse_price > 0) ? pl.warehouse_price : null,
competitorUnit: (pl && pl.competitor_price > 0) ? pl.competitor_price : null,
});

document.getElementById('pricing-total-vendor').textContent = hasAny ? formatCurrency(total) : '—';
document.getElementById('pricing-custom-price').value = hasAny ? total.toFixed(2) : '';
syncPricingLinkedInputs('price');
// ─── Build shared row data (unit prices for display, totals for math) ────
const _buildRows = () => {
const result = [];
const coveredLots = new Set();

// Update discount info only
const rows2 = document.querySelectorAll('#pricing-table-body tr.pricing-row');
let estimateTotal = 0;
rows2.forEach(tr => { estimateTotal += parseFloat(tr.dataset.est) || 0; });
const discountEl = document.getElementById('pricing-discount-info');
const pctEl = document.getElementById('pricing-discount-pct');
if (hasAny && total > 0 && estimateTotal > 0) {
pctEl.textContent = ((estimateTotal - total) / estimateTotal * 100).toFixed(1) + '%';
discountEl.classList.remove('hidden');
const _pushCartRow = (item, isEstOnly) => {
const pl = priceMap[item.lot_name];
const u = _getUnitPrices(pl);
const estUnit = u.estUnit > 0 ? u.estUnit : (item.unit_price || 0);
result.push({
lotCell: escapeHtml(item.lot_name), vendorPN: null,
desc: (compMap[item.lot_name] || {}).description || '',
qty: item.quantity,
estUnit, warehouseUnit: u.warehouseUnit, competitorUnit: u.competitorUnit,
est: estUnit * item.quantity,
warehouse: u.warehouseUnit != null ? u.warehouseUnit * item.quantity : null,
competitor: u.competitorUnit != null ? u.competitorUnit * item.quantity : null,
vendorOrig: null, vendorOrigUnit: null, isEstOnly,
});
};

if (!bomRows.length) {
cart.forEach(item => { _pushCartRow(item, false); coveredLots.add(item.lot_name); });
return { result, coveredLots };
}

bomRows.forEach(row => {
const baseLot = rowBaseLot(row);
const allocs = _getRowAllocations(row).filter(a => a.lot_name && _bomLotValid(a.lot_name) && a.quantity >= 1);
if (baseLot) coveredLots.add(baseLot);
allocs.forEach(a => coveredLots.add(a.lot_name));

// Accumulate unit prices per 1 vendor PN (base + allocs)
let rowEstUnit = 0, rowWhUnit = 0, rowCompUnit = 0;
let hasEst = false, hasWh = false, hasComp = false;
if (baseLot) {
const u = _getUnitPrices(priceMap[baseLot]);
const lotQty = _getRowLotQtyPerPN(row);
if (u.estUnit > 0) { rowEstUnit += u.estUnit * lotQty; hasEst = true; }
if (u.warehouseUnit != null) { rowWhUnit += u.warehouseUnit * lotQty; hasWh = true; }
if (u.competitorUnit != null) { rowCompUnit += u.competitorUnit * lotQty; hasComp = true; }
}
allocs.forEach(a => {
const u = _getUnitPrices(priceMap[a.lot_name]);
if (u.estUnit > 0) { rowEstUnit += u.estUnit * a.quantity; hasEst = true; }
if (u.warehouseUnit != null) { rowWhUnit += u.warehouseUnit * a.quantity; hasWh = true; }
if (u.competitorUnit != null) { rowCompUnit += u.competitorUnit * a.quantity; hasComp = true; }
});

let lotCell = '<span class="text-red-500">н/д</span>';
if (baseLot && allocs.length) lotCell = `${escapeHtml(baseLot)} <span class="text-gray-400">+${allocs.length}</span>`;
else if (baseLot) lotCell = escapeHtml(baseLot);
else if (allocs.length) lotCell = `${escapeHtml(allocs[0].lot_name)}${allocs.length > 1 ? ` <span class="text-gray-400">+${allocs.length - 1}</span>` : ''}`;

const vendorOrigUnit = row.unit_price != null ? row.unit_price
: (row.total_price != null && row.quantity > 0 ? row.total_price / row.quantity : null);
const vendorOrig = row.total_price != null ? row.total_price
: (row.unit_price != null ? row.unit_price * row.quantity : null);
const desc = row.description || (baseLot ? ((compMap[baseLot] || {}).description || '') : '');
result.push({
lotCell, vendorPN: row.vendor_pn, desc, qty: row.quantity,
estUnit: hasEst ? rowEstUnit : 0,
warehouseUnit: hasWh ? rowWhUnit : null,
competitorUnit: hasComp ? rowCompUnit : null,
est: hasEst ? rowEstUnit * row.quantity : 0,
warehouse: hasWh ? rowWhUnit * row.quantity : null,
competitor: hasComp ? rowCompUnit * row.quantity : null,
vendorOrig, vendorOrigUnit, isEstOnly: false,
});
});

// Estimate-only LOTs (cart items not covered by BOM)
cart.forEach(item => {
if (!item?.lot_name || coveredLots.has(item.lot_name)) return;
_pushCartRow(item, true);
coveredLots.add(item.lot_name);
});

return { result, coveredLots };
};

const { result: rowData } = _buildRows();

// ─── Populate Buy table ──────────────────────────────────────────────────
tbodyBuy.innerHTML = '';
if (!rowData.length) {
tbodyBuy.innerHTML = '<tr><td colspan="8" class="px-3 py-8 text-center text-gray-400">Нет данных для отображения</td></tr>';
tfootBuy.classList.add('hidden');
} else {
discountEl.classList.add('hidden');
let totEst = 0, totWh = 0, totComp = 0, totVendor = 0;
let hasEst = false, hasWh = false, hasComp = false, hasVendor = false;
let cntWh = 0, cntComp = 0;
rowData.forEach(r => {
const tr = document.createElement('tr');
tr.classList.add('pricing-row-buy');
if (r.isEstOnly) tr.classList.add('bg-blue-50');
tr.dataset.est = r.est;
tr.dataset.qty = r.qty;
tr.dataset.vendorOrig = r.vendorOrig != null ? r.vendorOrig : '';
tr.dataset.vendorOrigUnit = r.vendorOrigUnit != null ? r.vendorOrigUnit : '';
if (r.est > 0) { totEst += r.est; hasEst = true; }
if (r.warehouse != null) { totWh += r.warehouse; hasWh = true; cntWh++; }
if (r.competitor != null) { totComp += r.competitor; hasComp = true; cntComp++; }
if (r.vendorOrig != null) { totVendor += r.vendorOrig; hasVendor = true; }
tr.innerHTML = `
<td class="px-3 py-1.5 text-xs">${r.lotCell}</td>
<td class="px-3 py-1.5 font-mono text-xs ${r.vendorPN == null ? 'text-gray-400' : ''}">${r.vendorPN != null ? escapeHtml(r.vendorPN) : '—'}</td>
<td class="px-3 py-1.5 text-xs text-gray-500 truncate max-w-xs">${escapeHtml(r.desc)}</td>
<td class="px-3 py-1.5 text-right text-xs">${r.qty}</td>
<td class="px-3 py-1.5 text-right text-xs">${r.estUnit > 0 ? formatCurrency(r.estUnit) : '—'}</td>
<td class="px-3 py-1.5 text-right text-xs">${r.warehouseUnit != null ? formatCurrency(r.warehouseUnit) : '—'}</td>
<td class="px-3 py-1.5 text-right text-xs">${r.competitorUnit != null ? formatCurrency(r.competitorUnit) : '—'}</td>
<td class="px-3 py-1.5 text-right text-xs pricing-vendor-price-buy ${r.vendorOrigUnit == null ? 'text-gray-400' : ''}">${r.vendorOrigUnit != null ? formatCurrency(r.vendorOrigUnit) : '—'}</td>
`;
tbodyBuy.appendChild(tr);
});
document.getElementById('pricing-total-buy-estimate').textContent = hasEst ? formatCurrency(totEst) : '—';
document.getElementById('pricing-total-buy-vendor').textContent = hasVendor ? formatCurrency(totVendor) : '—';
_setPartialTotal('pricing-total-buy-warehouse', hasWh, totWh, cntWh, rowData.length);
_setPartialTotal('pricing-total-buy-competitor', hasComp, totComp, cntComp, rowData.length);
tfootBuy.classList.remove('hidden');
}

// ─── Populate Sale table ─────────────────────────────────────────────────
tbodySale.innerHTML = '';
if (!rowData.length) {
tbodySale.innerHTML = '<tr><td colspan="8" class="px-3 py-8 text-center text-gray-400">Нет данных для отображения</td></tr>';
tfootSale.classList.add('hidden');
} else {
let totEst = 0, totWh = 0, totComp = 0;
let hasEst = false, hasWh = false, hasComp = false;
let cntWh = 0, cntComp = 0;
rowData.forEach(r => {
const tr = document.createElement('tr');
tr.classList.add('pricing-row-sale');
if (r.isEstOnly) tr.classList.add('bg-blue-50');
const saleEstUnit = r.estUnit > 0 ? r.estUnit * saleUplift : 0;
const saleWhUnit = r.warehouseUnit != null ? r.warehouseUnit * SALE_FIXED_MULT : null;
const saleCompUnit = r.competitorUnit != null ? r.competitorUnit * SALE_FIXED_MULT : null;
const saleEstTotal = saleEstUnit * r.qty;
const saleWhTotal = saleWhUnit != null ? saleWhUnit * r.qty : null;
const saleCompTotal = saleCompUnit != null ? saleCompUnit * r.qty : null;
tr.dataset.estSale = saleEstTotal;
tr.dataset.qty = r.qty;
if (saleEstTotal > 0) { totEst += saleEstTotal; hasEst = true; }
if (saleWhTotal != null) { totWh += saleWhTotal; hasWh = true; cntWh++; }
if (saleCompTotal != null) { totComp += saleCompTotal; hasComp = true; cntComp++; }
tr.innerHTML = `
<td class="px-3 py-1.5 text-xs">${r.lotCell}</td>
<td class="px-3 py-1.5 font-mono text-xs ${r.vendorPN == null ? 'text-gray-400' : ''}">${r.vendorPN != null ? escapeHtml(r.vendorPN) : '—'}</td>
<td class="px-3 py-1.5 text-xs text-gray-500 truncate max-w-xs">${escapeHtml(r.desc)}</td>
<td class="px-3 py-1.5 text-right text-xs">${r.qty}</td>
<td class="px-3 py-1.5 text-right text-xs">${saleEstUnit > 0 ? formatCurrency(saleEstUnit) : '—'}</td>
<td class="px-3 py-1.5 text-right text-xs">${saleWhUnit != null ? formatCurrency(saleWhUnit) : '—'}</td>
<td class="px-3 py-1.5 text-right text-xs">${saleCompUnit != null ? formatCurrency(saleCompUnit) : '—'}</td>
<td class="px-3 py-1.5 text-right text-xs pricing-vendor-price-sale text-gray-400">—</td>
`;
tbodySale.appendChild(tr);
});
document.getElementById('pricing-total-sale-estimate').textContent = hasEst ? formatCurrency(totEst) : '—';
document.getElementById('pricing-total-sale-vendor').textContent = '—';
_setPartialTotal('pricing-total-sale-warehouse', hasWh, totWh, cntWh, rowData.length);
_setPartialTotal('pricing-total-sale-competitor', hasComp, totComp, cntComp, rowData.length);
tfootSale.classList.remove('hidden');
}

// Restore custom prices after re-render
applyCustomPrice('buy');
applyCustomPrice('sale');
}

function getPricingEstimateTotal() {
const rows = document.querySelectorAll('#pricing-table-body tr.pricing-row');
let estimateTotal = 0;
rows.forEach(tr => { estimateTotal += parseFloat(tr.dataset.est) || 0; });
return estimateTotal;
// ─── Pricing helpers ─────────────────────────────────────────────────────────

// Sets a footer total cell. If has prices but coverage < totalRows, marks red with a hover asterisk.
function _setPartialTotal(elId, has, total, count, totalRows) {
const el = document.getElementById(elId);
if (!el) return;
el.className = el.className.replace(/\btext-red-\d+\b/g, '').trim();
if (!has) { el.textContent = '—'; return; }
if (count < totalRows) {
el.innerHTML = `<span class="text-red-600">${formatCurrency(total)}</span> <span class="text-red-400 cursor-help" title="Цены указаны не для всех позиций (${count} из ${totalRows})">*</span>`;
} else {
el.textContent = formatCurrency(total);
}
}

function parseDecimalInput(raw) {
@@ -3775,136 +3807,187 @@ function formatUpliftInput(value) {
return value.toFixed(4).replace('.', ',');
}

function syncPricingLinkedInputs(source) {
const customPriceInput = document.getElementById('pricing-custom-price');
const upliftInput = document.getElementById('pricing-uplift');
if (!customPriceInput || !upliftInput) return;
const estimateTotal = getPricingEstimateTotal();
if (estimateTotal <= 0) {
upliftInput.value = '';
return;
}
if (source === 'price') {
const customPrice = parseFloat(customPriceInput.value) || 0;
upliftInput.value = customPrice > 0 ? formatUpliftInput(customPrice / estimateTotal) : '';
return;
}
if (source === 'uplift') {
const uplift = parseDecimalInput(upliftInput.value);
customPriceInput.value = uplift > 0 ? (estimateTotal * uplift).toFixed(2) : '';
}
function _getPricingEstimateTotal(table) {
const attr = table === 'sale' ? 'estSale' : 'est';
const cls = table === 'sale' ? 'pricing-row-sale' : 'pricing-row-buy';
let total = 0;
document.querySelectorAll(`#pricing-body-${table} tr.${cls}`).forEach(tr => {
total += parseFloat(tr.dataset[attr]) || 0;
});
return total;
}

function onPricingUpliftInput() {
syncPricingLinkedInputs('uplift');
const customPrice = parseFloat(document.getElementById('pricing-custom-price').value) || 0;
applyPricingCustomPrice(customPrice);
}
// Apply custom (own) price proportionally to Ручная цена column.
// table: 'buy' | 'sale'
function applyCustomPrice(table) {
const inputId = `pricing-custom-price-${table}`;
const totalElId = `pricing-total-${table}-vendor`;
const rowClass = `pricing-row-${table}`;
const cellClass = `.pricing-vendor-price-${table}`;
const estAttr = table === 'sale' ? 'estSale' : 'est';
const origAttr = table === 'buy' ? 'vendorOrig' : null;

function onPricingCustomPriceInput() {
syncPricingLinkedInputs('price');
const customPrice = parseFloat(document.getElementById('pricing-custom-price').value) || 0;
applyPricingCustomPrice(customPrice);
}
const customPrice = parseFloat(document.getElementById(inputId)?.value) || 0;
const estimateTotal = _getPricingEstimateTotal(table);
const rows = document.querySelectorAll(`#pricing-body-${table} tr.${rowClass}`);
const vendorCells = document.querySelectorAll(`#pricing-body-${table} ${cellClass}`);
const totalVendorEl = document.getElementById(totalElId);

function applyPricingCustomPrice(customPrice) {
const estimateTotal = getPricingEstimateTotal();
const rows = document.querySelectorAll('#pricing-table-body tr.pricing-row');

const vendorCells = document.querySelectorAll('#pricing-table-body .pricing-vendor-price');
const totalVendorEl = document.getElementById('pricing-total-vendor');
const _pctLabel = (custom, est) => {
if (est <= 0) return '';
const pct = ((est - custom) / est * 100);
const sign = pct >= 0 ? '-' : '+';
return ` (${sign}${Math.abs(pct).toFixed(1)}%)`;
};
const _pctClass = (custom, est) => custom <= est ? 'text-green-600' : 'text-red-600';

if (customPrice > 0 && estimateTotal > 0) {
// Proportionally redistribute custom price → Цена проектная cells
let assigned = 0;
rows.forEach((tr, i) => {
const est = parseFloat(tr.dataset.est) || 0;
const rowEst = parseFloat(tr.dataset[estAttr]) || 0;
const qty = Math.max(1, parseFloat(tr.dataset.qty) || 1);
const cell = vendorCells[i];
if (!cell) return;
let share;
if (i === rows.length - 1) {
share = customPrice - assigned;
} else {
share = Math.round((est / estimateTotal) * customPrice * 100) / 100;
share = Math.round((rowEst / estimateTotal) * customPrice * 100) / 100;
assigned += share;
}
cell.textContent = formatCurrency(share);
cell.classList.add('text-blue-700');
cell.classList.remove('text-gray-400');
cell.textContent = formatCurrency(share / qty);
cell.className = cell.className.replace(/\btext-(?:gray|green|red|blue)-\d+\b/g, '').trim();
cell.classList.add(rowEst > 0 ? _pctClass(share, rowEst) : 'text-blue-700');
});
totalVendorEl.textContent = formatCurrency(customPrice);
const pctStr = _pctLabel(customPrice, estimateTotal);
totalVendorEl.textContent = formatCurrency(customPrice) + pctStr;
totalVendorEl.className = totalVendorEl.className.replace(/\btext-(?:green|red)-\d+\b/g, '').trim();
totalVendorEl.classList.add(_pctClass(customPrice, estimateTotal));
} else {
// Restore original vendor prices from BOM
// Restore originals
rows.forEach((tr, i) => {
const cell = vendorCells[i];
if (!cell) return;
const orig = tr.dataset.vendorOrig;
if (orig !== '') {
cell.textContent = formatCurrency(parseFloat(orig));
cell.classList.remove('text-blue-700', 'text-gray-400');
cell.className = cell.className.replace(/\btext-(?:gray|green|red|blue)-\d+\b/g, '').trim();
if (origAttr && tr.dataset.vendorOrigUnit !== '') {
cell.textContent = formatCurrency(parseFloat(tr.dataset.vendorOrigUnit));
} else {
cell.textContent = '—';
cell.classList.add('text-gray-400');
cell.classList.remove('text-blue-700');
}
});
// Recompute vendor total from originals
let origTotal = 0; let hasOrig = false;
rows.forEach(tr => { if (tr.dataset.vendorOrig !== '') { origTotal += parseFloat(tr.dataset.vendorOrig) || 0; hasOrig = true; } });
totalVendorEl.textContent = hasOrig ? formatCurrency(origTotal) : '—';
}

// Discount info
const discountEl = document.getElementById('pricing-discount-info');
const pctEl = document.getElementById('pricing-discount-pct');
if (customPrice > 0 && estimateTotal > 0) {
const discount = ((estimateTotal - customPrice) / estimateTotal * 100).toFixed(1);
pctEl.textContent = discount + '%';
discountEl.classList.remove('hidden');
} else {
discountEl.classList.add('hidden');
// Recompute total from originals (buy) or clear (sale)
if (origAttr) {
let origTotal = 0; let hasOrig = false;
rows.forEach(tr => { if (tr.dataset[origAttr] !== '') { origTotal += parseFloat(tr.dataset[origAttr]) || 0; hasOrig = true; } });
totalVendorEl.textContent = hasOrig ? formatCurrency(origTotal) : '—';
} else {
// sale: reset to — already handled above
totalVendorEl.textContent = '—';
}
totalVendorEl.className = totalVendorEl.className.replace(/\btext-(?:green|red)-\d+\b/g, '').trim();
}
}
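The redistribution loop in `applyCustomPrice` above rounds each row's estimate-weighted share to cents and gives the last row the remainder, so the per-row shares always sum back to the entered total. A minimal standalone sketch of that rule (illustrative only, `splitProportionally` is not a function from this diff):

```javascript
// Split a custom total across rows proportionally to their estimates.
// Every row but the last gets its share rounded to cents; the last row
// absorbs the rounding remainder so the shares sum to customPrice.
function splitProportionally(rowEstimates, customPrice) {
  const estimateTotal = rowEstimates.reduce((sum, est) => sum + est, 0);
  let assigned = 0;
  return rowEstimates.map((est, i) => {
    if (i === rowEstimates.length - 1) {
      return customPrice - assigned; // remainder row, as in the code above
    }
    const share = Math.round((est / estimateTotal) * customPrice * 100) / 100;
    assigned += share;
    return share;
  });
}
```

Without the remainder row, independently rounded shares can drift a few cents away from the entered total.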

function exportPricingCSV() {
const rows = document.querySelectorAll('#pricing-table-body tr.pricing-row');
function onBuyCustomPriceInput() {
applyCustomPrice('buy');
}

function onSaleCustomPriceInput() {
applyCustomPrice('sale');
}

function onSaleMarkupInput() {
renderPricingTab();
}

function setPricingCustomPriceFromVendor() {
// Fill Ручная цена in Buy table from BOM vendor totals
const rows = document.querySelectorAll('#pricing-body-buy tr.pricing-row-buy');
const vendorCells = document.querySelectorAll('#pricing-body-buy .pricing-vendor-price-buy');
let total = 0;
let hasAny = false;
rows.forEach((tr, i) => {
const cell = vendorCells[i];
if (!cell) return;
const origUnit = tr.dataset.vendorOrigUnit;
const origTotal = tr.dataset.vendorOrig;
if (origUnit !== '') {
cell.textContent = formatCurrency(parseFloat(origUnit));
cell.className = cell.className.replace(/\btext-(?:gray|green|red|blue)-\d+\b/g, '').trim();
total += parseFloat(origTotal) || 0;
hasAny = true;
} else {
cell.textContent = '—';
cell.className = cell.className.replace(/\btext-(?:green|red|blue)-\d+\b/g, '').trim();
cell.classList.add('text-gray-400');
}
});
const estimateTotal = _getPricingEstimateTotal('buy');
const totalEl = document.getElementById('pricing-total-buy-vendor');
if (hasAny) {
document.getElementById('pricing-custom-price-buy').value = total.toFixed(2);
const pct = estimateTotal > 0 ? ` (-${((estimateTotal - total) / estimateTotal * 100).toFixed(1)}%)` : '';
totalEl.textContent = formatCurrency(total) + pct;
totalEl.className = totalEl.className.replace(/\btext-(?:green|red)-\d+\b/g, '').trim();
totalEl.classList.add(total <= estimateTotal ? 'text-green-600' : 'text-red-600');
} else {
document.getElementById('pricing-custom-price-buy').value = '';
totalEl.textContent = '—';
}
}

function exportPricingCSV(table) {
const bodyId = table === 'sale' ? 'pricing-body-sale' : 'pricing-body-buy';
const rowClass = table === 'sale' ? 'pricing-row-sale' : 'pricing-row-buy';
const totalIds = table === 'sale'
? { est: 'pricing-total-sale-estimate', wh: 'pricing-total-sale-warehouse', comp: 'pricing-total-sale-competitor', vendor: 'pricing-total-sale-vendor' }
: { est: 'pricing-total-buy-estimate', wh: 'pricing-total-buy-warehouse', comp: 'pricing-total-buy-competitor', vendor: 'pricing-total-buy-vendor' };

const rows = document.querySelectorAll(`#${bodyId} tr.${rowClass}`);
if (!rows.length) { showToast('Нет данных для экспорта', 'error'); return; }

const csvDelimiter = ';';
const cleanExportCell = value => {
const text = String(value || '').replace(/\s+/g, ' ').trim();
if (!text || text === '—') return text || '';
return text
.replace(/\s*\(.*\)$/, '')
.replace(/\s*\*+\s*$/, '')
.trim();
};
const csvEscape = v => {
if (v == null) return '';
const s = String(v).replace(/"/g, '""');
return /[,"\n]/.test(s) ? `"${s}"` : s;
return /[;"\n\r]/.test(s) ? `"${s}"` : s;
};

const headers = ['Lot', 'P/N вендора', 'Описание', 'Кол-во', 'Цена проектная'];
const lines = [headers.map(csvEscape).join(',')];
const headers = ['Lot', 'PN вендора', 'Описание', 'Кол-во', 'Estimate', 'Склад', 'Конкуренты', 'Ручная цена'];
const lines = [headers.map(csvEscape).join(csvDelimiter)];

rows.forEach(tr => {
const cells = tr.querySelectorAll('td');
const lot = cells[0] ? cells[0].textContent.trim() : '';
const vendorPN = cells[1] ? cells[1].textContent.trim() : '';
const description = cells[2] ? cells[2].textContent.trim() : '';
const qty = cells[3] ? cells[3].textContent.trim() : '';
const vendorPrice = cells[5] ? cells[5].textContent.trim() : '';
lines.push([lot, vendorPN, description, qty, vendorPrice].map(csvEscape).join(','));
const cols = [0,1,2,3,4,5,6,7].map(i => cells[i] ? cleanExportCell(cells[i].textContent) : '');
lines.push(cols.map(csvEscape).join(csvDelimiter));
});

// Totals row
const vendorTotal = document.getElementById('pricing-total-vendor').textContent.trim();
lines.push(['', '', '', 'Итого:', vendorTotal].map(csvEscape).join(','));
const tEst = cleanExportCell(document.getElementById(totalIds.est)?.textContent);
const tWh = cleanExportCell(document.getElementById(totalIds.wh)?.textContent);
const tComp = cleanExportCell(document.getElementById(totalIds.comp)?.textContent);
const tVendor = cleanExportCell(document.getElementById(totalIds.vendor)?.textContent);
lines.push(['', '', '', 'Итого:', tEst, tWh, tComp, tVendor].map(csvEscape).join(csvDelimiter));

const blob = new Blob(['\uFEFF' + lines.join('\r\n')], {type: 'text/csv;charset=utf-8;'});
const url = URL.createObjectURL(blob);
const a = document.createElement('a');
a.href = url;
const today = new Date();
const yyyy = today.getFullYear();
const mm = String(today.getMonth() + 1).padStart(2, '0');
const dd = String(today.getDate()).padStart(2, '0');
const datePart = `${yyyy}-${mm}-${dd}`;
const datePart = `${today.getFullYear()}-${String(today.getMonth()+1).padStart(2,'0')}-${String(today.getDate()).padStart(2,'0')}`;
const codePart = (projectCode || 'NO-PROJECT').trim();
const namePart = (configName || 'config').trim();
a.download = `${datePart} (${codePart}) ${namePart} SPEC.csv`;
const suffix = table === 'sale' ? 'SALE' : 'BUY';
a.download = `${datePart} (${codePart}) ${namePart} SPEC-${suffix}.csv`;
a.click();
URL.revokeObjectURL(url);
}
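The export above switches from comma- to semicolon-delimited CSV and widens the quoting rule accordingly. A minimal standalone sketch of that escaping (illustrative, mirroring the `csvEscape` arrow function in the diff; the real export additionally prepends a UTF-8 BOM, `'\uFEFF'`, so Excel detects the encoding):

```javascript
// Escape one CSV field for a semicolon-delimited file: double any quotes,
// then wrap the field in quotes if it contains the delimiter, a quote,
// or a line break, so spreadsheet apps keep it as a single cell.
const csvEscape = v => {
  if (v == null) return '';
  const s = String(v).replace(/"/g, '""');
  return /[;"\n\r]/.test(s) ? `"${s}"` : s;
};
```

Quoting on `;` rather than `,` matters here because the exported numbers use comma decimal separators, which would otherwise split cells.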
|
||||
|
||||
@@ -59,9 +59,8 @@
|
||||
<th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase">Категория</th>
|
||||
<th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase">Описание</th>
|
||||
<th id="th-qty" class="hidden px-6 py-3 text-right text-xs font-medium text-gray-500 uppercase">Доступно</th>
|
||||
<th id="th-partnumbers" class="hidden px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase">Partnumbers</th>
|
||||
<th class="px-6 py-3 text-right text-xs font-medium text-gray-500 uppercase">Цена, $</th>
|
||||
<th class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase">Настройки</th>
|
||||
<th id="th-settings" class="px-6 py-3 text-left text-xs font-medium text-gray-500 uppercase">Настройки</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody id="items-body" class="bg-white divide-y divide-gray-200">
|
||||
@@ -150,18 +149,23 @@
}
}

function isStockSource() {
const src = (currentSource || '').toLowerCase();
return src === 'warehouse' || src === 'competitor';
}

function isWarehouseSource() {
return (currentSource || '').toLowerCase() === 'warehouse';
}

function itemsColspan() {
return isWarehouseSource() ? 7 : 5;
return isStockSource() ? 4 : 5;
}

function toggleWarehouseColumns() {
const visible = isWarehouseSource();
document.getElementById('th-qty').classList.toggle('hidden', !visible);
document.getElementById('th-partnumbers').classList.toggle('hidden', !visible);
const stock = isStockSource();
document.getElementById('th-qty').classList.toggle('hidden', true);
document.getElementById('th-settings').classList.toggle('hidden', stock);
}

function formatQty(qty) {
@@ -234,27 +238,34 @@
return;
}

const showWarehouse = isWarehouseSource();
const stock = isStockSource();
const p = stock ? 'px-3 py-2' : 'px-6 py-3';
const descMax = stock ? 30 : 60;

const html = items.map(item => {
const price = item.price.toLocaleString('en-US', { minimumFractionDigits: 2, maximumFractionDigits: 2 });
const description = item.lot_description || '-';
const truncatedDesc = description.length > 60 ? description.substring(0, 60) + '...' : description;
const qty = formatQty(item.available_qty);
const partnumbers = Array.isArray(item.partnumbers) && item.partnumbers.length > 0 ? item.partnumbers.join(', ') : '—';
const truncatedDesc = description.length > descMax ? description.substring(0, descMax) + '...' : description;

// Price cell — add spread badge for competitor
let priceHtml = price;
if (!isWarehouseSource() && item.price_spread_pct > 0) {
priceHtml += ` <span class="text-xs text-amber-600 font-medium" title="Разброс цен конкурентов">±${item.price_spread_pct.toFixed(0)}%</span>`;
}

return `
<tr class="hover:bg-gray-50">
<td class="px-6 py-3 whitespace-nowrap">
<span class="font-mono text-sm">${item.lot_name}</span>
<td class="${p} max-w-[160px]">
<span class="font-mono text-sm break-all">${escapeHtml(item.lot_name)}</span>
</td>
<td class="px-6 py-3 whitespace-nowrap">
<span class="px-2 py-1 text-xs bg-gray-100 rounded">${item.category || '-'}</span>
<td class="${p} whitespace-nowrap">
<span class="px-2 py-1 text-xs bg-gray-100 rounded">${escapeHtml(item.category || '-')}</span>
</td>
<td class="px-6 py-3 text-sm text-gray-500" title="${description}">${truncatedDesc}</td>
${showWarehouse ? `<td class="px-6 py-3 whitespace-nowrap text-right font-mono">${qty}</td>` : ''}
${showWarehouse ? `<td class="px-6 py-3 text-sm text-gray-600" title="${escapeHtml(partnumbers)}">${escapeHtml(partnumbers)}</td>` : ''}
<td class="px-6 py-3 whitespace-nowrap text-right font-mono">${price}</td>
<td class="px-6 py-3 whitespace-nowrap text-sm"><span class="text-xs bg-gray-100 px-2 py-1 rounded">${formatPriceSettings(item)}</span></td>
<td class="${p} text-sm text-gray-500" title="${escapeHtml(description)}">${escapeHtml(truncatedDesc)}</td>
<td class="${p} whitespace-nowrap text-right font-mono">${priceHtml}</td>
${!stock ? `<td class="${p} whitespace-nowrap text-sm"><span class="text-xs bg-gray-100 px-2 py-1 rounded">${formatPriceSettings(item)}</span></td>` : ''}
</tr>
`;
}).join('');

@@ -29,23 +29,26 @@
<button onclick="openNewVariantModal()" class="inline-flex w-full sm:w-auto justify-center items-center px-3 py-1.5 text-sm font-medium bg-purple-600 text-white rounded-lg hover:bg-purple-700">
+ Вариант
</button>
<button onclick="openVariantActionModal()" class="inline-flex w-full sm:w-auto justify-center items-center px-3 py-1.5 text-sm font-medium bg-indigo-600 text-white rounded-lg hover:bg-indigo-700">
Действия с вариантом
</button>
</div>
</div>
</div>

<div id="action-buttons" class="mt-4 grid grid-cols-1 sm:grid-cols-6 gap-3">
<button onclick="openCreateModal()" class="py-2 bg-blue-600 text-white rounded-lg hover:bg-blue-700 font-medium">
Новая конфигурация
+ Конфигурация
</button>
<button onclick="openVendorImportModal()" class="py-2 bg-amber-600 text-white rounded-lg hover:bg-amber-700 font-medium">
Импорт выгрузки вендора
</button>
<button onclick="openProjectSettingsModal()" class="py-2 bg-gray-700 text-white rounded-lg hover:bg-gray-800 font-medium">
Параметры
Импорт
</button>
<button onclick="openExportModal()" class="py-2 bg-green-600 text-white rounded-lg hover:bg-green-700 font-medium">
Экспорт CSV
</button>
<button onclick="openProjectSettingsModal()" class="py-2 bg-gray-700 text-white rounded-lg hover:bg-gray-800 font-medium">
Параметры
</button>
<button id="delete-variant-btn" onclick="deleteVariant()" class="py-2 bg-red-600 text-white rounded-lg hover:bg-red-700 font-medium hidden">
Удалить вариант
</button>
@@ -173,6 +176,34 @@
</div>
</div>

<div id="variant-action-modal" class="fixed inset-0 bg-black bg-opacity-50 hidden items-center justify-center z-50">
<div class="bg-white rounded-lg shadow-xl w-full max-w-md mx-4 p-6">
<h2 class="text-xl font-semibold mb-4">Действия с вариантом</h2>
<div class="space-y-4">
<div>
<label class="block text-sm font-medium text-gray-700 mb-1">Название</label>
<input type="text" id="variant-action-name"
class="w-full px-3 py-2 border rounded focus:ring-2 focus:ring-indigo-500 focus:border-indigo-500">
</div>
<label class="flex items-center gap-2 text-sm text-gray-700">
<input type="checkbox" id="variant-action-copy" class="rounded border-gray-300">
Создать копию
</label>
<div>
<label class="block text-sm font-medium text-gray-700 mb-1">Код проекта</label>
<input type="text" id="variant-action-code"
class="w-full px-3 py-2 border rounded focus:ring-2 focus:ring-indigo-500 focus:border-indigo-500">
</div>
<input type="hidden" id="variant-action-current-name">
<input type="hidden" id="variant-action-current-code">
</div>
<div class="flex justify-end space-x-3 mt-6">
<button onclick="closeVariantActionModal()" class="px-4 py-2 text-gray-600 hover:text-gray-800">Отмена</button>
<button onclick="saveVariantAction()" class="px-4 py-2 bg-indigo-600 text-white rounded hover:bg-indigo-700">Сохранить</button>
</div>
</div>
</div>

<div id="config-action-modal" class="fixed inset-0 bg-black bg-opacity-50 hidden items-center justify-center z-50">
<div class="bg-white rounded-lg shadow-xl w-full max-w-md mx-4 p-6">
<h2 class="text-xl font-semibold mb-4">Действия с конфигурацией</h2>
@@ -540,6 +571,213 @@ function closeNewVariantModal() {
document.getElementById('new-variant-modal').classList.remove('flex');
}

function openVariantActionModal() {
if (!project) return;
const currentName = (project.variant || '').trim();
const currentCode = (project.code || '').trim();
document.getElementById('variant-action-current-name').value = currentName;
document.getElementById('variant-action-current-code').value = currentCode;
document.getElementById('variant-action-name').value = currentName;
document.getElementById('variant-action-code').value = currentCode;
document.getElementById('variant-action-copy').checked = false;
document.getElementById('variant-action-modal').classList.remove('hidden');
document.getElementById('variant-action-modal').classList.add('flex');
const nameInput = document.getElementById('variant-action-name');
nameInput.focus();
nameInput.select();
}

function closeVariantActionModal() {
document.getElementById('variant-action-modal').classList.add('hidden');
document.getElementById('variant-action-modal').classList.remove('flex');
}

function findUniqueVariantActionName(baseName, targetCode, excludeProjectUUID) {
const cleanedBase = (baseName || '').trim();
if (!cleanedBase || normalizeVariantLabel(cleanedBase).toLowerCase() === 'main') {
return {error: 'Имя варианта не должно быть пустым и не может быть main'};
}

const code = (targetCode || '').trim();
const used = new Set(
projectsCatalog
.filter(p => (p.code || '').trim().toLowerCase() === code.toLowerCase())
.filter(p => !excludeProjectUUID || p.uuid !== excludeProjectUUID)
.map(p => ((p.variant || '').trim()).toLowerCase())
);

if (!used.has(cleanedBase.toLowerCase())) {
return {name: cleanedBase, changed: false};
}

let candidate = cleanedBase + '_копия';
let suffix = 2;
while (used.has(candidate.toLowerCase())) {
candidate = cleanedBase + '_копия' + suffix;
suffix++;
}
return {name: candidate, changed: true};
}

async function resolveUniqueConfigActionName(baseName, targetProjectUUID, excludeConfigUUID) {
const cleanedBase = (baseName || '').trim();
if (!cleanedBase) {
return {error: 'Введите название'};
}

let configs = [];
if (targetProjectUUID === projectUUID) {
configs = Array.isArray(allConfigs) ? allConfigs : [];
} else {
const resp = await fetch('/api/projects/' + targetProjectUUID + '/configs?status=all');
if (!resp.ok) {
return {error: 'Не удалось проверить конфигурации целевого проекта'};
}
const data = await resp.json().catch(() => ({}));
configs = Array.isArray(data.configurations) ? data.configurations : [];
}

const used = new Set(
configs
.filter(cfg => !excludeConfigUUID || cfg.uuid !== excludeConfigUUID)
.map(cfg => (cfg.name || '').trim().toLowerCase())
);

if (!used.has(cleanedBase.toLowerCase())) {
return {name: cleanedBase, changed: false};
}

let candidate = cleanedBase + '_копия';
let suffix = 2;
while (used.has(candidate.toLowerCase())) {
candidate = cleanedBase + '_копия' + suffix;
suffix++;
}
return {name: candidate, changed: true};
}

async function cloneVariantConfigurations(targetProjectUUID) {
const listResp = await fetch('/api/projects/' + projectUUID + '/configs');
if (!listResp.ok) {
throw new Error('Не удалось загрузить конфигурации варианта');
}
const listData = await listResp.json().catch(() => ({}));
const configs = Array.isArray(listData.configurations) ? listData.configurations : [];
for (const cfg of configs) {
const cloneResp = await fetch('/api/projects/' + targetProjectUUID + '/configs/' + cfg.uuid + '/clone', {
method: 'POST',
headers: {'Content-Type': 'application/json'},
body: JSON.stringify({name: cfg.name})
});
if (!cloneResp.ok) {
throw new Error('Не удалось скопировать конфигурацию «' + (cfg.name || 'без названия') + '»');
}
}
}

async function saveVariantAction() {
if (!project) return;
const notify = (message, type) => {
if (typeof showToast === 'function') {
showToast(message, type || 'success');
} else {
alert(message);
}
};

const currentName = document.getElementById('variant-action-current-name').value.trim();
const currentCode = document.getElementById('variant-action-current-code').value.trim();
const rawName = document.getElementById('variant-action-name').value.trim();
const code = document.getElementById('variant-action-code').value.trim();
const copy = document.getElementById('variant-action-copy').checked;

if (!code) {
notify('Введите код проекта', 'error');
return;
}
const uniqueNameResult = findUniqueVariantActionName(rawName, code, copy ? '' : projectUUID);
if (uniqueNameResult.error) {
notify(uniqueNameResult.error, 'error');
return;
}
const name = uniqueNameResult.name;
if (uniqueNameResult.changed) {
document.getElementById('variant-action-name').value = name;
notify('Имя варианта занято, использовано ' + name, 'success');
}

if (copy) {
const createResp = await fetch('/api/projects', {
method: 'POST',
headers: {'Content-Type': 'application/json'},
body: JSON.stringify({
code: code,
variant: name,
name: project.name || null,
tracker_url: (project.tracker_url || '').trim()
})
});
if (!createResp.ok) {
if (createResp.status === 400) {
notify('Имя варианта не может быть main', 'error');
return;
}
if (createResp.status === 409) {
notify('Вариант с таким кодом и значением уже существует', 'error');
return;
}
notify('Не удалось создать копию варианта', 'error');
return;
}
const created = await createResp.json().catch(() => null);
if (!created || !created.uuid) {
notify('Не удалось создать копию варианта', 'error');
return;
}
try {
await cloneVariantConfigurations(created.uuid);
} catch (err) {
notify(err.message || 'Вариант создан, но конфигурации не скопированы полностью', 'error');
window.location.href = '/projects/' + created.uuid;
return;
}
closeVariantActionModal();
notify('Копия варианта создана', 'success');
window.location.href = '/projects/' + created.uuid;
return;
}

const changed = name !== currentName || code !== currentCode;
if (!changed) {
closeVariantActionModal();
return;
}

const updateResp = await fetch('/api/projects/' + projectUUID, {
method: 'PUT',
headers: {'Content-Type': 'application/json'},
body: JSON.stringify({code: code, variant: name})
});
if (!updateResp.ok) {
if (updateResp.status === 400) {
notify('Имя варианта не может быть main', 'error');
return;
}
if (updateResp.status === 409) {
notify('Вариант с таким кодом и значением уже существует', 'error');
return;
}
notify('Не удалось сохранить вариант', 'error');
return;
}

closeVariantActionModal();
await loadProject();
await loadConfigs();
updateDeleteVariantButton();
notify('Вариант обновлён', 'success');
}

async function createNewVariant() {
if (!project) return;
const code = (project.code || '').trim();
@@ -864,12 +1102,22 @@ async function saveConfigAction() {
notify('Введите название', 'error');
return;
}
const uniqueNameResult = await resolveUniqueConfigActionName(name, targetProjectUUID, copy ? '' : uuid);
if (uniqueNameResult.error) {
notify(uniqueNameResult.error, 'error');
return;
}
const resolvedName = uniqueNameResult.name;
if (uniqueNameResult.changed) {
document.getElementById('config-action-name').value = resolvedName;
notify('Имя занято, использовано ' + resolvedName, 'success');
}

if (copy) {
const cloneResp = await fetch('/api/projects/' + targetProjectUUID + '/configs/' + uuid + '/clone', {
method: 'POST',
headers: {'Content-Type': 'application/json'},
body: JSON.stringify({name: name})
body: JSON.stringify({name: resolvedName})
});
if (!cloneResp.ok) {
notify('Не удалось скопировать конфигурацию', 'error');
@@ -886,11 +1134,11 @@ async function saveConfigAction() {
}

let changed = false;
if (name !== currentName) {
if (resolvedName !== currentName) {
const renameResp = await fetch('/api/configs/' + uuid + '/rename', {
method: 'PATCH',
headers: {'Content-Type': 'application/json'},
body: JSON.stringify({name: name})
body: JSON.stringify({name: resolvedName})
});
if (!renameResp.ok) {
notify('Не удалось переименовать конфигурацию', 'error');
@@ -1016,6 +1264,7 @@ function updateDeleteVariantButton() {
document.getElementById('create-modal').addEventListener('click', function(e) { if (e.target === this) closeCreateModal(); });
document.getElementById('vendor-import-modal').addEventListener('click', function(e) { if (e.target === this) closeVendorImportModal(); });
document.getElementById('new-variant-modal').addEventListener('click', function(e) { if (e.target === this) closeNewVariantModal(); });
document.getElementById('variant-action-modal').addEventListener('click', function(e) { if (e.target === this) closeVariantActionModal(); });
document.getElementById('config-action-modal').addEventListener('click', function(e) { if (e.target === this) closeConfigActionModal(); });
document.getElementById('project-settings-modal').addEventListener('click', function(e) { if (e.target === this) closeProjectSettingsModal(); });
document.getElementById('config-action-project-input').addEventListener('input', function(e) {
@@ -1026,7 +1275,7 @@ document.getElementById('config-action-copy').addEventListener('change', functio
const currentName = document.getElementById('config-action-current-name').value;
const nameInput = document.getElementById('config-action-name');
if (e.target.checked && nameInput.value.trim() === currentName.trim()) {
nameInput.value = currentName + ' (копия)';
nameInput.value = currentName + '_копия';
}
syncActionModalMode();
});
@@ -1034,6 +1283,7 @@ document.addEventListener('keydown', function(e) {
if (e.key === 'Escape') {
closeCreateModal();
closeVendorImportModal();
closeVariantActionModal();
closeConfigActionModal();
closeProjectSettingsModal();
}

@@ -64,7 +64,7 @@ let status = 'active';
let projectsSearch = '';
let authorSearch = '';
let currentPage = 1;
let perPage = 10;
let perPage = 33;
let sortField = 'created_at';
let sortDir = 'desc';
let createProjectTrackerManuallyEdited = false;
@@ -114,21 +114,21 @@ function formatDateParts(value) {
};
}

function renderAuditCell(value, user) {
const parts = formatDateParts(value);
const safeUser = escapeHtml((user || '—').trim() || '—');
if (!parts) {
return '<div class="leading-tight">' +
'<div class="text-gray-400">—</div>' +
'<div class="text-gray-400">—</div>' +
'<div class="text-gray-500">@ ' + safeUser + '</div>' +
'</div>';
}
return '<div class="leading-tight whitespace-nowrap">' +
'<div>' + escapeHtml(parts.date) + '</div>' +
'<div class="text-gray-500">' + escapeHtml(parts.time) + '</div>' +
'<div class="text-gray-600">@ ' + safeUser + '</div>' +
'</div>';
function formatISODate(value) {
if (!value) return '—';
const date = new Date(value);
if (Number.isNaN(date.getTime())) return '—';
return date.toISOString().slice(0, 10);
}

function renderProjectDateCell(project) {
const updatedDate = formatISODate(project && project.updated_at);
const tooltip = [
'Создан: ' + formatDateTime(project && project.created_at),
'Изменен: ' + formatDateTime(project && project.updated_at),
'Автор: ' + ((project && project.owner_username) || '—')
].join('\n');
return '<div class="whitespace-nowrap text-gray-600 cursor-help" title="' + escapeHtml(tooltip) + '">' + escapeHtml(updatedDate) + '</div>';
}

function normalizeVariant(variant) {
@@ -141,11 +141,11 @@ function renderVariantChips(code, fallbackVariant, fallbackUUID) {
if (!variants.length) {
const single = normalizeVariant(fallbackVariant);
const href = fallbackUUID ? ('/projects/' + fallbackUUID) : '/projects';
return '<a href="' + href + '" class="inline-flex items-center px-2 py-0.5 text-xs rounded-full bg-gray-100 text-gray-600 hover:bg-gray-200 hover:text-gray-900">' + escapeHtml(single) + '</a>';
return '<a href="' + href + '" class="inline-flex items-center px-1.5 py-px text-xs leading-5 rounded-full bg-gray-100 text-gray-600 hover:bg-gray-200 hover:text-gray-900">' + escapeHtml(single) + '</a>';
}
return variants.map(v => {
const href = v.uuid ? ('/projects/' + v.uuid) : '/projects';
return '<a href="' + href + '" class="inline-flex items-center px-2 py-0.5 text-xs rounded-full bg-gray-100 text-gray-700 hover:bg-gray-200 hover:text-gray-900">' + escapeHtml(v.label) + '</a>';
return '<a href="' + href + '" class="inline-flex items-center px-1.5 py-px text-xs leading-5 rounded-full bg-gray-100 text-gray-700 hover:bg-gray-200 hover:text-gray-900">' + escapeHtml(v.label) + '</a>';
}).join(' ');
}

@@ -262,25 +262,25 @@ async function loadProjects() {
let html = '<div class="overflow-x-auto"><table class="w-full table-fixed min-w-[980px]">';
html += '<thead class="bg-gray-50">';
html += '<tr>';
html += '<th class="w-28 px-4 py-3 text-left text-xs font-medium text-gray-500 uppercase">Код</th>';
html += '<th class="w-28 px-4 py-3 text-left text-xs font-medium text-gray-500 uppercase">Дата</th>';
html += '<th class="w-32 px-4 py-3 text-left text-xs font-medium text-gray-500 uppercase">Код</th>';
html += '<th class="px-4 py-3 text-left text-xs font-medium text-gray-500 uppercase">';
html += '<button type="button" onclick="toggleSort(\'name\')" class="inline-flex items-center gap-1 hover:text-gray-700">Название';
if (sortField === 'name') {
html += sortDir === 'asc' ? ' <span>↑</span>' : ' <span>↓</span>';
}
html += '</button></th>';
html += '<th class="w-44 px-4 py-3 text-left text-xs font-medium text-gray-500 uppercase">Создан</th>';
html += '<th class="w-44 px-4 py-3 text-left text-xs font-medium text-gray-500 uppercase">Изменен</th>';
html += '<th class="w-36 px-4 py-3 text-left text-xs font-medium text-gray-500 uppercase">Варианты</th>';
html += '<th class="w-36 px-4 py-3 text-right text-xs font-medium text-gray-500 uppercase">Действия</th>';
html += '<th class="w-24 px-4 py-3 text-left text-xs font-medium text-gray-500 uppercase">Автор</th>';
html += '<th class="w-56 px-4 py-3 text-left text-xs font-medium text-gray-500 uppercase">Варианты</th>';
html += '<th class="w-14 px-2 py-3 text-right text-xs font-medium text-gray-500 uppercase"></th>';
html += '</tr>';
html += '<tr>';
html += '<th class="px-4 py-2"></th>';
html += '<th class="px-2 py-2"></th>';
html += '<th class="px-4 py-2"></th>';
html += '<th class="px-4 py-2"><input id="projects-author-filter" type="text" value="' + escapeHtml(authorSearch) + '" placeholder="Фильтр автора" class="w-full px-2 py-1 border rounded text-xs focus:ring-1 focus:ring-blue-500 focus:border-blue-500"></th>';
html += '<th class="px-4 py-2"></th>';
html += '<th class="px-4 py-2"></th>';
html += '<th class="px-4 py-2"></th>';
html += '<th class="px-2 py-2"></th>';
html += '</tr>';
html += '</thead><tbody class="divide-y">';

@@ -292,36 +292,21 @@ async function loadProjects() {
html += '<tr class="hover:bg-gray-50">';
const displayName = p.name || '';
const createdBy = p.owner_username || '—';
const updatedBy = '—';
const variantChips = renderVariantChips(p.code, p.variant, p.uuid);
html += '<td class="px-4 py-3 text-sm font-medium align-top"><a class="inline-block max-w-full text-blue-600 hover:underline whitespace-nowrap" href="/projects/' + p.uuid + '">' + escapeHtml(p.code || '—') + '</a></td>';
html += '<td class="px-4 py-3 text-sm text-gray-700 align-top"><div class="truncate" title="' + escapeHtml(displayName) + '">' + escapeHtml(displayName || '—') + '</div></td>';
html += '<td class="px-4 py-3 text-sm text-gray-600 align-top">' + renderAuditCell(p.created_at, createdBy) + '</td>';
html += '<td class="px-4 py-3 text-sm text-gray-600 align-top">' + renderAuditCell(p.updated_at, updatedBy) + '</td>';
html += '<td class="px-4 py-3 text-sm align-top">' + renderProjectDateCell(p) + '</td>';
html += '<td class="px-4 py-3 text-sm font-medium align-top break-words"><a class="inline text-blue-600 hover:underline break-all whitespace-normal" href="/projects/' + p.uuid + '">' + escapeHtml(p.code || '—') + '</a></td>';
html += '<td class="px-4 py-3 text-sm text-gray-700 align-top break-words"><div class="whitespace-normal break-words" title="' + escapeHtml(displayName) + '">' + escapeHtml(displayName || '—') + '</div></td>';
html += '<td class="px-4 py-3 text-sm text-gray-600 align-top whitespace-nowrap">' + escapeHtml(createdBy) + '</td>';
html += '<td class="px-4 py-3 text-sm align-top"><div class="flex flex-wrap gap-1">' + variantChips + '</div></td>';
html += '<td class="px-4 py-3 text-sm text-right"><div class="inline-flex items-center gap-2">';
html += '<td class="px-2 py-3 text-sm text-right"><div class="inline-flex items-center justify-end gap-2">';

if (p.is_active) {
const safeName = escapeHtml(displayName).replace(/'/g, "\\'");
html += '<button onclick="copyProject(' + JSON.stringify(p.uuid) + ', ' + JSON.stringify(displayName) + ')" class="text-green-700 hover:text-green-900" title="Копировать">';
html += '<svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24"><path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 16H6a2 2 0 01-2-2V6a2 2 0 012-2h8a2 2 0 012 2v2m-6 12h8a2 2 0 002-2v-8a2 2 0 00-2-2h-8a2 2 0 00-2 2v8a2 2 0 002 2z"></path></svg>';
html += '</button>';

html += '<button onclick="renameProject(' + JSON.stringify(p.uuid) + ', ' + JSON.stringify(displayName) + ')" class="text-blue-700 hover:text-blue-900" title="Переименовать">';
html += '<svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24"><path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M11 5H6a2 2 0 00-2 2v11a2 2 0 002 2h11a2 2 0 002-2v-5m-1.414-9.414a2 2 0 112.828 2.828L11.828 15H9v-2.828l8.586-8.586z"></path></svg>';
html += '</button>';

if ((p.tracker_url || '').trim() !== '') {
html += '<a href="' + escapeHtml(p.tracker_url) + '" target="_blank" rel="noopener noreferrer" class="inline-flex items-center justify-center w-5 h-5 text-sky-700 hover:text-sky-900 font-semibold" title="Открыть в трекере">T</a>';
}
html += '<button onclick="archiveProject(\'' + p.uuid + '\')" class="text-red-700 hover:text-red-900" title="Удалить (в архив)">';
html += '<svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24"><path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 7l-.867 12.142A2 2 0 0116.138 21H7.862a2 2 0 01-1.995-1.858L5 7m5 4v6m4-6v6m1-10V4a1 1 0 00-1-1h-4a1 1 0 00-1 1v3M4 7h16"></path></svg>';
html += '</button>';

html += '<button onclick="addConfigToProject(\'' + p.uuid + '\')" class="text-indigo-700 hover:text-indigo-900" title="Добавить конфигурацию">';
html += '<svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24"><path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 4v16m8-8H4"></path></svg>';
html += '</button>';
} else {
html += '<button onclick="reactivateProject(\'' + p.uuid + '\')" class="text-emerald-700 hover:text-emerald-900" title="Восстановить">';
html += '<svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24"><path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M5 13l4 4L19 7"></path></svg>';
html += '</button>';
}
html += '</div></td>';
html += '</tr>';