17 Commits

Author SHA1 Message Date
Mikhail Chusavitin
99fd80bca7 feat: unify sync functionality with event-driven UI updates
- Refactored navbar sync button to dispatch 'sync-completed' event
- Configs page: removed the duplicate 'Импорт с сервера' (import from server) button, added auto-refresh on sync
- Projects page: wrapped initialization in DOMContentLoaded, added auto-refresh on sync
- Pricelists page: added auto-refresh on sync completion
- Consistent UX: all lists refresh automatically after clicking the 'Синхронизация' (sync) button
- Removed code duplication: importConfigsFromServer() function no longer needed
- Event-driven architecture enables easy extension to other pages

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
2026-02-10 11:11:10 +03:00
Mikhail Chusavitin
d8edd5d5f0 chore: exclude qfs binary and update release notes for v1.2.2
- Add qfs binary to gitignore (compiled executable from build)
- Update UI labels in configuration form for clarity
- Add release notes documenting v1.2.2 changes

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
2026-02-09 17:50:58 +03:00
Mikhail Chusavitin
9cb17ee03f chore: simplify gitignore rules for release binaries
- Ignore all files in releases/ directory (binaries, archives, checksums)
- Preserve releases/memory/ for changelog tracking
- Changed from 'releases/' to 'releases/*' for clearer intent

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
2026-02-09 17:41:41 +03:00
Mikhail Chusavitin
8f596cec68 fix: standardize CSV export filename format to use project name
Unified export filename format across both ExportCSV and ExportConfigCSV:
- Format: YYYY-MM-DD (project_name) config_name BOM.csv
- Use PriceUpdatedAt if available, otherwise CreatedAt
- Extract project name from ProjectUUID for ExportCSV via projectService
- Pass project_uuid from frontend to backend in export request
- Add projectUUID and projectName state variables to track project context

This ensures consistent naming whether exporting from form or project view,
and uses most recent price update timestamp in filename.

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
2026-02-09 17:22:51 +03:00
Mikhail Chusavitin
8fd27d11a7 docs: update v1.2.1 release notes with full changelog
Added comprehensive release notes including:
- Summary of the v1.2.1 patch release
- Bug fix details for configurator component substitution
- API price loading implementation
- Testing verification
- Installation instructions for all platforms
- Migration notes (no DB migration required)

Release notes now provide full context for end users and developers.

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
2026-02-09 15:45:00 +03:00
Mikhail Chusavitin
600f842b82 docs: add releases/memory directory for changelog tracking
Added structured changelog documentation:
- Created releases/memory/ directory to track changes between tags
- Each version has a .md file (v1.2.1.md, etc.) documenting commits and impact
- Updated CLAUDE.md with release notes reference
- Updated README.md with releases section
- Updated .gitignore to track releases/memory/ while ignoring other release artifacts

This helps reviewers and developers understand changes between versions
before making new updates to the codebase.

Initial entry: v1.2.1.md documenting the pricelist refactor and
configurator component substitution fix.

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
2026-02-09 15:40:23 +03:00
Mikhail Chusavitin
acf7c8a4da fix: load component prices via API instead of removed current_price field
After the recent refactor that removed CurrentPrice from local_components,
the configurator's autocomplete was filtering out all components because
it checked for the now-removed current_price field.

Instead, now load prices from the API when the user starts typing in a
component search field:
- Added ensurePricesLoaded() to fetch prices via /api/quote/price-levels
- Added componentPricesCache to store loaded prices
- Updated all 3 autocomplete modes (single, multi, section) to load prices
- Changed price checks from c.current_price to hasComponentPrice()
- Updated cart item creation to use cached prices

Components without prices are still filtered out as required, but the check
now uses API data rather than a removed database field.

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
2026-02-09 15:31:53 +03:00
Mikhail Chusavitin
5984a57a8b refactor: remove CurrentPrice from local_components and transition to pricelist-based pricing
## Overview
Removed the CurrentPrice and SyncedAt fields from local_components, transitioning to a
pricelist-based pricing model where all prices are sourced from local_pricelist_items
based on the configuration's selected pricelist.

## Changes

### Data Model Updates
- **LocalComponent**: Now stores only metadata (LotName, LotDescription, Category, Model)
  - Removed: CurrentPrice, SyncedAt (both redundant)
  - Pricing is now exclusively sourced from local_pricelist_items

- **LocalConfiguration**: Added pricelist selection fields
  - Added: WarehousePricelistID, CompetitorPricelistID
  - These complement the existing PricelistID (Estimate)

### Migrations
- Added migration "drop_component_unused_fields" to remove CurrentPrice and SyncedAt columns
- Added migration "add_warehouse_competitor_pricelists" to add new pricelist fields

### Component Sync
- Removed current_price from MariaDB query
- Removed CurrentPrice assignment in component creation
- SyncComponentPrices now exclusively updates based on pricelist_items via quote calculation

### Quote Calculation
- Added PricelistID field to QuoteRequest
- Updated local-first path to use pricelist_items instead of component.CurrentPrice
- Falls back to latest estimate pricelist if PricelistID not specified
- Maintains offline-first behavior: local queries work without MariaDB

### Configuration Refresh
- Removed fallback on component.CurrentPrice
- Prices are only refreshed from local_pricelist_items
- If price not found in pricelist, original price is preserved

### API Changes
- Removed CurrentPrice from ComponentView
- Components API no longer returns pricing information
- Pricing is accessed via QuoteService or PricelistService

### Code Cleanup
- Removed UpdateComponentPricesFromPricelist() method
- Removed EnsureComponentPricesFromPricelists() method
- Updated UnifiedRepository to remove offline pricing logic
- Updated converters to remove CurrentPrice mapping

## Architecture Impact
- Components = metadata store only
- Prices = managed by pricelist system
- Quote calculation = owns all pricing logic
- Local-first behavior preserved: SQLite queries work offline, no MariaDB dependency

## Testing
- Build successful
- All code compiles without errors
- Ready for migration testing with existing databases
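The quote-calculation fallback described above ("falls back to latest estimate pricelist if PricelistID not specified") might look roughly like this. A sketch only: the `Pricelist` struct and `choosePricelist` helper are illustrative assumptions, not the project's real types.

```go
package main

import "fmt"

// Pricelist is an illustrative stand-in for the project's pricelist model.
type Pricelist struct {
	ID     uint
	Source string // "estimate", "warehouse", "competitor"
}

// choosePricelist resolves which pricelist prices a quote: the explicitly
// requested one if given, else the most recent "estimate" pricelist.
func choosePricelist(requestedID uint, all []Pricelist) (Pricelist, error) {
	if requestedID != 0 {
		for _, pl := range all {
			if pl.ID == requestedID {
				return pl, nil
			}
		}
		return Pricelist{}, fmt.Errorf("pricelist %d not found", requestedID)
	}
	// Fallback: latest estimate pricelist (slice assumed sorted oldest to newest).
	for i := len(all) - 1; i >= 0; i-- {
		if all[i].Source == "estimate" {
			return all[i], nil
		}
	}
	return Pricelist{}, fmt.Errorf("no estimate pricelist available")
}

func main() {
	pls := []Pricelist{{1, "estimate"}, {2, "warehouse"}, {3, "estimate"}}
	pl, _ := choosePricelist(0, pls)
	fmt.Println(pl.ID)
}
```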

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
2026-02-09 14:54:02 +03:00
Mikhail Chusavitin
84dda8cf0a docs: document complete database user permissions for sync support
Add comprehensive database permissions documentation:
- Full list of required tables with their purpose
- Separate sections for: existing user grants, new user creation, and important notes
- Clarifies that sync tables (qt_client_local_migrations, qt_client_schema_state,
  qt_pricelist_sync_status) must be created by DB admin - app doesn't need CREATE TABLE
- Explains read-only vs read-write permissions for each table
- Uses placeholder '<DB_USER>' instead of hardcoded usernames

This helps administrators set up proper permissions without CREATE TABLE requirements,
fixing the sync blockage issue in v1.1.0.

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
2026-02-09 11:30:09 +03:00
Mikhail Chusavitin
abeb26d82d fix: handle database permission issues in sync migration verification
Sync was blocked because the migration registry table creation required
CREATE TABLE permissions that the database user might not have.

Changes:
- Check if migration registry tables exist before attempting to create them
- Skip creation if table exists and user lacks CREATE permissions
- Use information_schema to reliably check table existence
- Apply same fix to user sync status table creation
- Gracefully handle ALTER TABLE failures for backward compatibility

This allows sync to proceed even if the client is a read-limited database user,
as long as the required tables have already been created by an administrator.
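The `information_schema` existence check described here might be sketched as below. Hypothetical helper, assuming MariaDB/MySQL semantics; the project's actual query may differ.

```go
package main

import (
	"database/sql"
	"fmt"
)

// existsQuery asks information_schema whether a table is already present,
// instead of running CREATE TABLE (which a read-limited user cannot do).
const existsQuery = `SELECT COUNT(*) FROM information_schema.tables
 WHERE table_schema = DATABASE() AND table_name = ?`

// tableExists lets sync proceed for read-limited users: if the table is
// already there (created by an admin), creation is skipped entirely.
func tableExists(db *sql.DB, name string) (bool, error) {
	var n int
	if err := db.QueryRow(existsQuery, name).Scan(&n); err != nil {
		return false, fmt.Errorf("check table %s: %w", name, err)
	}
	return n > 0, nil
}

func main() {
	fmt.Println("existence check query:", existsQuery)
}
```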

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
2026-02-09 11:22:33 +03:00
Mikhail Chusavitin
29edd73744 projects: add /all endpoint for unlimited project list
Solve pagination issue where configs reference projects not in the
paginated list (default 10 items, but there could be 50+ projects).

Changes:
- Add GET /api/projects/all endpoint that returns ALL projects without
  pagination as simple {uuid, name} objects
- Update frontend loadProjectsForConfigUI() to use /api/projects/all
  instead of /api/projects?status=all
- Ensures all projects are available in projectNameByUUID for config
  display, regardless of total project count

This fixes cases where project names don't display in /configs page
for configs that reference projects outside the paginated range.

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
2026-02-09 11:19:49 +03:00
Mikhail Chusavitin
e8d0e28415 export: add project name to CSV filename format
Update filename format to include both project and quotation names:
  YYYY-MM-DD (PROJECT-NAME) QUOTATION-NAME BOM.csv

Changes:
- Add ProjectName field to ExportRequest (optional)
- Update ExportCSV: use project_name if provided, otherwise fall back to name
- Update ExportConfigCSV: use config name for both project and quotation

Example filenames:
  2026-02-09 (OPS-1957) config1 BOM.csv
  2026-02-09 (MyProject) MyQuotation BOM.csv

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
2026-02-09 11:02:36 +03:00
Mikhail Chusavitin
08feda9af6 export: use filename from Content-Disposition header in browser
Fix issue where frontend was ignoring server's Content-Disposition
header and using only config name + '.csv' for exported files.

Added getFilenameFromResponse() helper to extract proper filename
from Content-Disposition header and use it for downloaded files.

Applied to both:
- exportCSV() function
- exportCSVWithCustomPrice() function

Now files are downloaded with correct format:
  YYYY-MM-DD (PROJECT-NAME) BOM.csv
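The frontend helper is JavaScript, but the header contract it relies on can be illustrated in Go with the standard `mime` package. The function name `filenameFromDisposition` is an assumption for illustration.

```go
package main

import (
	"fmt"
	"mime"
)

// filenameFromDisposition mirrors what the frontend helper does: extract the
// filename the server put into Content-Disposition instead of inventing one.
func filenameFromDisposition(header, fallback string) string {
	_, params, err := mime.ParseMediaType(header)
	if err != nil {
		return fallback
	}
	if name, ok := params["filename"]; ok && name != "" {
		return name
	}
	return fallback
}

func main() {
	h := `attachment; filename="2026-02-09 (OPS-1957) BOM.csv"`
	fmt.Println(filenameFromDisposition(h, "export.csv"))
}
```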

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
2026-02-09 10:58:01 +03:00
Mikhail Chusavitin
af79b6f3bf export: update CSV filename format to YYYY-MM-DD (PROJECT-NAME) BOM
Change exported CSV filename format from:
  YYYY-MM-DD NAME SPEC.csv
To:
  YYYY-MM-DD (NAME) BOM.csv

Applied to both:
- POST /api/export/csv (direct export)
- GET /api/configs/:uuid/export (config export)

All tests passing.

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
2026-02-09 10:49:56 +03:00
Mikhail Chusavitin
bca82f9dc0 export: implement streaming CSV with Excel compatibility
Implement Phase 1 CSV Export Optimization:
- Replace buffering with true HTTP streaming (ToCSV writes to io.Writer)
- Add UTF-8 BOM (0xEF 0xBB 0xBF) for correct Cyrillic display in Excel
- Use semicolon (;) delimiter for Russian Excel locale
- Use comma (,) as decimal separator in numbers (100,50 instead of 100.50)
- Add graceful two-phase error handling:
  * Before streaming: return JSON errors for validation failures
  * During streaming: log errors only (HTTP 200 already sent)
- Add backward-compatible ToCSVBytes() helper
- Add GET /api/configs/:uuid/export route for configuration export
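The three Excel-compatibility tricks listed above (UTF-8 BOM, semicolon delimiter, comma decimal separator) combine as in this sketch; function names are illustrative, not the project's API.

```go
package main

import (
	"bytes"
	"encoding/csv"
	"fmt"
	"strconv"
	"strings"
)

// ruNumber renders a float with a comma decimal separator ("100,50"),
// matching the Russian Excel locale used by the export.
func ruNumber(v float64) string {
	return strings.Replace(strconv.FormatFloat(v, 'f', 2, 64), ".", ",", 1)
}

// writeExcelCSV shows the three tricks together: a UTF-8 BOM so Excel
// detects the encoding, a semicolon delimiter, and comma decimals in cells.
func writeExcelCSV(rows [][]string) []byte {
	var buf bytes.Buffer
	buf.Write([]byte{0xEF, 0xBB, 0xBF}) // BOM for Cyrillic display in Excel
	w := csv.NewWriter(&buf)
	w.Comma = ';' // commas inside numbers are then plain cell content
	for _, row := range rows {
		w.Write(row)
	}
	w.Flush()
	return buf.Bytes()
}

func main() {
	out := writeExcelCSV([][]string{{"Итого", ruNumber(100.5)}})
	fmt.Printf("%q\n", out)
}
```

Because the delimiter is `;`, the comma in `100,50` needs no quoting and Excel reads it as a plain numeric cell.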

New tests (13 total):
- Service layer (7 tests):
  * UTF-8 BOM verification
  * Semicolon delimiter parsing
  * Total row formatting
  * Category sorting
  * Empty data handling
  * Backward compatibility wrapper
  * Writer error handling
- Handler layer (6 tests):
  * Successful CSV export with streaming
  * Invalid request validation
  * Empty items validation
  * Config export with proper headers
  * 404 for missing configs
  * Empty config validation

All tests passing, build verified.

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
2026-02-09 10:47:10 +03:00
17969277e6 pricing: enrich pricelist items with stock and tighten CORS 2026-02-08 10:27:36 +03:00
0dbfe45353 security: harden secret hygiene and pre-commit scanning 2026-02-08 10:27:23 +03:00
34 changed files with 1978 additions and 390 deletions

.githooks/pre-commit (new executable file, +5 lines)
@@ -0,0 +1,5 @@
#!/usr/bin/env bash
set -euo pipefail
repo_root="$(git rev-parse --show-toplevel)"
"$repo_root/scripts/check-secrets.sh"

.gitignore (17 changed lines)
@@ -1,5 +1,16 @@
# QuoteForge
config.yaml
.env
.env.*
*.pem
*.key
*.p12
*.pfx
*.crt
id_rsa
id_rsa.*
secrets.yaml
secrets.yml

# Local SQLite database (contains encrypted credentials)
/data/*.db
@@ -12,6 +23,7 @@ config.yaml
/importer
/cron
/bin/
qfs

# Local Go build cache used in sandboxed runs
.gocache/
@@ -63,4 +75,7 @@ Network Trash Folder
Temporary Items
.apdisk

# Release artifacts (binaries, archives, checksums), but DO track releases/memory/ for changelog
releases/*
!releases/memory/
!releases/memory/**

CLAUDE.md
@@ -56,6 +56,12 @@
- `/pricelists/:id`
- `/setup`

## Release Notes & Change Log

Release notes are maintained in the `releases/memory/` directory, organized by version tag (e.g., `v1.2.1.md`).
Before working on the codebase, review the most recent release notes to understand recent changes.
- Check `releases/memory/` for the detailed changelog between tags
- Each release file documents commits, breaking changes, and migration notes

## Commands

```bash
# Development

Makefile
@@ -1,4 +1,4 @@
.PHONY: build build-release clean test run version install-hooks

# Get version from git
VERSION := $(shell git describe --tags --always --dirty 2>/dev/null || echo "dev")

@@ -72,6 +72,12 @@ deps:
	go mod download
	go mod tidy

# Install local git hooks
install-hooks:
	git config core.hooksPath .githooks
	chmod +x .githooks/pre-commit scripts/check-secrets.sh
	@echo "Installed git hooks from .githooks/"

# Help
help:
	@echo "QuoteForge Server (qfs) - Build Commands"

@@ -92,6 +98,7 @@ help:
	@echo "  run            Run development server"
	@echo "  watch          Run with auto-restart (requires entr)"
	@echo "  deps           Install/update dependencies"
	@echo "  install-hooks  Install local git hooks (secret scan on commit)"
	@echo "  help           Show this help"
	@echo ""
	@echo "Current version: $(VERSION)"

README.md (105 changed lines)
@@ -105,58 +105,85 @@ go run ./cmd/migrate_ops_projects -apply
go run ./cmd/migrate_ops_projects -apply -yes
```

### DB permissions for the application user

#### Full grant set for a regular user

To grant an existing user everything the app needs (without recreating the user):

```sql
-- Reference tables (read-only)
GRANT SELECT ON RFQ_LOG.lot TO '<DB_USER>'@'%';
GRANT SELECT ON RFQ_LOG.qt_lot_metadata TO '<DB_USER>'@'%';
GRANT SELECT ON RFQ_LOG.qt_categories TO '<DB_USER>'@'%';
GRANT SELECT ON RFQ_LOG.qt_pricelists TO '<DB_USER>'@'%';
GRANT SELECT ON RFQ_LOG.qt_pricelist_items TO '<DB_USER>'@'%';

-- Configuration and project tables (read-write)
GRANT SELECT, INSERT, UPDATE ON RFQ_LOG.qt_configurations TO '<DB_USER>'@'%';
GRANT SELECT, INSERT, UPDATE ON RFQ_LOG.qt_projects TO '<DB_USER>'@'%';

-- Sync tables (read-only migration registry, read-write status)
GRANT SELECT ON RFQ_LOG.qt_client_local_migrations TO '<DB_USER>'@'%';
GRANT SELECT, INSERT, UPDATE ON RFQ_LOG.qt_client_schema_state TO '<DB_USER>'@'%';
GRANT SELECT, INSERT, UPDATE ON RFQ_LOG.qt_pricelist_sync_status TO '<DB_USER>'@'%';

-- Apply the changes
FLUSH PRIVILEGES;

-- Verify the granted privileges
SHOW GRANTS FOR '<DB_USER>'@'%';
```

#### Tables and their purpose

| Table | Purpose | Privileges | Notes |
|-------|---------|------------|-------|
| `lot` | Component reference data | SELECT | Pre-existing table |
| `qt_lot_metadata` | Extended component data | SELECT | Component metadata |
| `qt_categories` | Component categories | SELECT | Reference data |
| `qt_pricelists` | Pricelists | SELECT | Managed by the server |
| `qt_pricelist_items` | Pricelist line items | SELECT | Managed by the server |
| `qt_configurations` | Saved configurations | SELECT, INSERT, UPDATE | Primary working table |
| `qt_projects` | Projects | SELECT, INSERT, UPDATE | Groups configurations |
| `qt_client_local_migrations` | Migration registry | SELECT | Read-only (managed by the admin) |
| `qt_client_schema_state` | Local schema state | SELECT, INSERT, UPDATE | Tracks applied migrations |
| `qt_pricelist_sync_status` | Sync status | SELECT, INSERT, UPDATE | Tracks sync activity |

#### Creating a new user from scratch

If you need to create a brand-new user:

```sql
-- 1) Create the user
CREATE USER IF NOT EXISTS 'quote_user'@'%' IDENTIFIED BY '<DB_PASSWORD>';

-- 2) Grant all required privileges
GRANT SELECT ON RFQ_LOG.lot TO 'quote_user'@'%';
GRANT SELECT ON RFQ_LOG.qt_lot_metadata TO 'quote_user'@'%';
GRANT SELECT ON RFQ_LOG.qt_categories TO 'quote_user'@'%';
GRANT SELECT ON RFQ_LOG.qt_pricelists TO 'quote_user'@'%';
GRANT SELECT ON RFQ_LOG.qt_pricelist_items TO 'quote_user'@'%';
GRANT SELECT, INSERT, UPDATE ON RFQ_LOG.qt_configurations TO 'quote_user'@'%';
GRANT SELECT, INSERT, UPDATE ON RFQ_LOG.qt_projects TO 'quote_user'@'%';
GRANT SELECT ON RFQ_LOG.qt_client_local_migrations TO 'quote_user'@'%';
GRANT SELECT, INSERT, UPDATE ON RFQ_LOG.qt_client_schema_state TO 'quote_user'@'%';
GRANT SELECT, INSERT, UPDATE ON RFQ_LOG.qt_pricelist_sync_status TO 'quote_user'@'%';

-- 3) Apply the changes
FLUSH PRIVILEGES;

-- 4) Verify
SHOW GRANTS FOR 'quote_user'@'%';
```

#### Important notes

- **Sync tables** must be created once by the DB administrator; the application does not need CREATE TABLE privileges.
- **Pricelists** (`qt_pricelists`, `qt_pricelist_items`) are reference tables managed by the server; the user only needs SELECT.
- **Configurations and projects** are written by the application itself (INSERT, UPDATE on save).
- **Migration tables** support sync: the application reads the migration list and reports which migrations it has applied.
- If you see `Access denied for user ...@'<ip>'`, check for conflicting user records with different hosts (user@localhost vs user@'%').

### 4. Component metadata import

@@ -187,6 +214,7 @@ make build-all        # Build for all platforms (Linux, macOS, Windows)
make build-windows    # Windows only
make run              # Start the dev server
make test             # Run tests
make install-hooks    # Install git hooks (blocks commits containing secrets)
make clean            # Clean bin/
make help             # Show all commands
```

@@ -319,9 +347,22 @@ quoteforge/
│   └── static/            # CSS, JS, images
├── migrations/            # SQL migrations
├── config.example.yaml    # Example configuration
├── releases/
│   └── memory/            # Changelog between tags (v1.2.1.md, v1.2.2.md, ...)
└── go.mod
```

## Releases & Changelog

The changelog between versions lives in the `releases/memory/` directory, in files named `v{major}.{minor}.{patch}.md`.

Each file contains:
- The list of commits between versions
- A description of the changes and their impact
- Breaking changes and migration notes

**Before working on the code, check the latest file in this folder to understand the current state of the project.**

## User roles

| Role | Description |
@@ -695,7 +695,7 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
	// Handlers
	componentHandler := handlers.NewComponentHandler(componentService, local)
	quoteHandler := handlers.NewQuoteHandler(quoteService)
	exportHandler := handlers.NewExportHandler(exportService, configService, componentService, projectService)
	pricelistHandler := handlers.NewPricelistHandler(local)
	syncHandler, err := handlers.NewSyncHandler(local, syncService, connMgr, templatesPath, backgroundSyncInterval)
	if err != nil {

@@ -1152,6 +1152,8 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
			"current_version": currentVersion,
		})
	})

	configs.GET("/:uuid/export", exportHandler.ExportConfigCSV)
}

projects := api.Group("/projects")

@@ -1164,7 +1166,8 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
	search := strings.ToLower(strings.TrimSpace(c.Query("search")))
	author := strings.ToLower(strings.TrimSpace(c.Query("author")))
	page, _ := strconv.Atoi(c.DefaultQuery("page", "1"))
	// Return all projects by default (high limit so configs can reference them)
	perPage, _ := strconv.Atoi(c.DefaultQuery("per_page", "1000"))
	sortField := strings.ToLower(strings.TrimSpace(c.DefaultQuery("sort", "created_at")))
	sortDir := strings.ToLower(strings.TrimSpace(c.DefaultQuery("dir", "desc")))
	if status != "active" && status != "archived" && status != "all" {

@@ -1316,6 +1319,32 @@ func setupRouter(cfg *config.Config, local *localdb.LocalDB, connMgr *db.Connect
	})
})

// GET /api/projects/all - returns all projects without pagination for UI dropdowns
projects.GET("/all", func(c *gin.Context) {
	allProjects, err := projectService.ListByUser(dbUsername, true)
	if err != nil {
		c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
		return
	}
	// Return a simplified list of all projects (UUID + Name only)
	type ProjectSimple struct {
		UUID string `json:"uuid"`
		Name string `json:"name"`
	}
	simplified := make([]ProjectSimple, 0, len(allProjects))
	for _, p := range allProjects {
		simplified = append(simplified, ProjectSimple{
			UUID: p.UUID,
			Name: p.Name,
		})
	}
	c.JSON(http.StatusOK, simplified)
})

projects.POST("", func(c *gin.Context) {
	var req services.CreateProjectRequest
	if err := c.ShouldBindJSON(&req); err != nil {

csv_export.md (new file, +297 lines)

@@ -0,0 +1,297 @@
# CSV Export Pattern (Go + GORM)

## Architecture (3 layers)

### 1. Handler Layer (HTTP)

**Responsibilities**: handle the HTTP request, set response headers, drive the export

```go
func (h *PricelistHandler) ExportCSV(c *gin.Context) {
	// 1. Validate parameters
	id, err := strconv.ParseUint(c.Param("id"), 10, 32)
	if err != nil {
		c.JSON(http.StatusBadRequest, gin.H{"error": "invalid id"})
		return
	}

	// 2. Fetch metadata to build the filename
	pl, err := h.service.GetByID(uint(id))
	if err != nil {
		c.JSON(http.StatusNotFound, gin.H{"error": "pricelist not found"})
		return
	}

	// 3. Set HTTP headers for the download
	filename := fmt.Sprintf("pricelist_%s.csv", pl.Version)
	c.Header("Content-Type", "text/csv; charset=utf-8")
	c.Header("Content-Disposition", fmt.Sprintf("attachment; filename=\"%s\"", filename))

	// 4. UTF-8 BOM for Excel compatibility
	c.Writer.Write([]byte{0xEF, 0xBB, 0xBF})

	// 5. Configure the CSV writer
	writer := csv.NewWriter(c.Writer)
	writer.Comma = ';' // Semicolon for Excel
	defer writer.Flush()

	// 6. Dynamic headers (depend on the data type)
	isWarehouse := strings.ToLower(pl.Source) == "warehouse"
	var header []string
	if isWarehouse {
		header = []string{"Артикул", "Категория", "Описание", "Доступно", "Partnumbers", "Цена, $", "Настройки"}
	} else {
		header = []string{"Артикул", "Категория", "Описание", "Цена, $", "Настройки"}
	}
	writer.Write(header)

	// 7. Stream in batches via a callback
	err = h.service.StreamItemsForExport(uint(id), 500, func(items []models.PricelistItem) error {
		for _, item := range items {
			row := buildRow(item, isWarehouse)
			if err := writer.Write(row); err != nil {
				return err
			}
		}
		writer.Flush() // Flush after every batch
		return nil
	})
	if err != nil {
		// Streaming already started: HTTP 200 is out, so only log the error
		log.Printf("csv export failed: %v", err)
	}
}
```

### 2. Service Layer

**Responsibilities**: orchestration, delegation to the repository

```go
func (s *Service) StreamItemsForExport(pricelistID uint, batchSize int, callback func(items []models.PricelistItem) error) error {
	if s.repo == nil {
		return fmt.Errorf("offline mode: cannot stream pricelist items")
	}
	return s.repo.StreamItemsForExport(pricelistID, batchSize, callback)
}
```
### 3. Repository Layer (the critical one)

**Responsibilities**: batch loading from the DB, query optimization, enrichment

```go
func (r *PricelistRepository) StreamItemsForExport(pricelistID uint, batchSize int, callback func(items []models.PricelistItem) error) error {
	if batchSize <= 0 {
		batchSize = 500 // Default batch size
	}

	// Check the pricelist type for conditional enrichment
	var pl models.Pricelist
	isWarehouse := false
	if err := r.db.Select("source").Where("id = ?", pricelistID).First(&pl).Error; err == nil {
		isWarehouse = pl.Source == string(models.PricelistSourceWarehouse)
	}

	offset := 0
	for {
		var items []models.PricelistItem
		// ⚡ KEY POINT: JOIN to avoid N+1 queries
		err := r.db.Table("qt_pricelist_items AS pi").
			Select("pi.*, COALESCE(l.lot_description, '') AS lot_description, COALESCE(l.lot_category, '') AS category").
			Joins("LEFT JOIN lot AS l ON l.lot_name = pi.lot_name").
			Where("pi.pricelist_id = ?", pricelistID).
			Order("pi.lot_name").
			Offset(offset).
			Limit(batchSize).
			Scan(&items).Error
		if err != nil {
			return err // Don't silently treat a DB error as end-of-data
		}
		if len(items) == 0 {
			break
		}

		// Conditional enrichment for warehouse data
		if isWarehouse {
			r.enrichWarehouseItems(items) // Adds qty, partnumbers
		}

		// Hand the batch to the callback
		if err := callback(items); err != nil {
			return err
		}

		if len(items) < batchSize {
			break // Last batch
		}
		offset += batchSize
	}
	return nil
}
```
## Key Patterns

### 1. Streaming (don't load everything into memory)

```go
// ❌ NOT this:
var allItems []Item
db.Find(&allItems) // Can blow up on millions of rows

// ✅ This:
for offset := 0; ; offset += batchSize {
	var batch []Item
	db.Offset(offset).Limit(batchSize).Find(&batch)
	processBatch(batch)
	if len(batch) < batchSize {
		break
	}
}
```

### 2. Callback pattern for flexibility

```go
// The service knows nothing about CSV; it can drive any export format
func StreamItems(callback func([]Item) error) error
```

### 3. JOIN to avoid N+1

```go
// ❌ N+1 problem:
items := getItems()
for _, item := range items {
	description := getLotDescription(item.LotName) // N queries
}

// ✅ JOIN:
db.Table("items AS i").
	Select("i.*, COALESCE(l.description, '') AS description").
	Joins("LEFT JOIN lots AS l ON l.name = i.lot_name")
```

### 4. UTF-8 BOM for Excel

```go
// Excel on Windows needs a BOM to render UTF-8 correctly
c.Writer.Write([]byte{0xEF, 0xBB, 0xBF})
```

### 5. Semicolon delimiter for Excel

```go
writer := csv.NewWriter(c.Writer)
writer.Comma = ';' // Excel in the Russian locale expects ;
```

### 6. Graceful error handling

```go
// Once streaming has started you cannot return JSON
if err != nil {
	// CSV output has already begun, so write plain text
	c.String(http.StatusInternalServerError, "Export failed: %v", err)
	return
}
```
## Conditional Enrichment Pattern

```go
// For warehouse pricelists, add extra fields
func (r *PricelistRepository) enrichWarehouseItems(items []models.PricelistItem) error {
	// 1. Collect unique lot_names
	lots := make([]string, 0, len(items))
	seen := make(map[string]struct{})
	for _, item := range items {
		if _, ok := seen[item.LotName]; !ok {
			lots = append(lots, item.LotName)
			seen[item.LotName] = struct{}{}
		}
	}

	// 2. Batch-load the metrics (qty, partnumbers)
	qtyByLot, partnumbersByLot, err := warehouse.LoadLotMetrics(r.db, lots, true)
	if err != nil {
		return err
	}

	// 3. Enrich the items
	for i := range items {
		if qty, ok := qtyByLot[items[i].LotName]; ok {
			items[i].AvailableQty = &qty
		}
		items[i].Partnumbers = partnumbersByLot[items[i].LotName]
	}
	return nil
}
```

## Virtual Fields Pattern

```go
type PricelistItem struct {
	// Stored fields
	ID      uint    `gorm:"primaryKey"`
	LotName string  `gorm:"size:255"`
	Price   float64 `gorm:"type:decimal(12,2)"`

	// Virtual fields (populated via JOIN or programmatically)
	LotDescription string   `gorm:"-:migration" json:"lot_description,omitempty"`
	Category       string   `gorm:"-:migration" json:"category,omitempty"`
	AvailableQty   *float64 `gorm:"-" json:"available_qty,omitempty"`
	Partnumbers    []string `gorm:"-" json:"partnumbers,omitempty"`
}
```

- `gorm:"-:migration"`: do not create a DB column, but map the field on SELECT
- `gorm:"-"`: ignore the field entirely for all DB operations
## Checklist for CSV Export
- [ ] HTTP headers: Content-Type, Content-Disposition
- [ ] UTF-8 BOM for Excel (0xEF, 0xBB, 0xBF)
- [ ] Delimiter (`;` for Excel with a Russian locale)
- [ ] Streaming with batch processing (do not load everything into memory)
- [ ] JOIN to avoid N+1 queries
- [ ] Flush after every batch
- [ ] Graceful error handling (no JSON once streaming has started)
- [ ] Dynamic headers (if needed)
- [ ] Conditional enrichment (if the data depends on the type)
## When to Use This Pattern
**Use it when:**
- Exporting large datasets (>1000 records)
- Excel compatibility is required
- Related data comes from several tables
- Enrichment logic is conditional

**Skip it when:**
- Datasets are small (<100 records) - everything can be loaded at once
- Exporting JSON/XML - different approaches apply
- There is no related data - the flow can be simplified
## Routing Example (Gin)
```go
// In the router file
func SetupRoutes(router *gin.Engine, handler *PricelistHandler) {
    api := router.Group("/api")
    {
        pricelists := api.Group("/pricelists")
        {
            pricelists.GET("/:id/export", handler.ExportCSV)
        }
    }
}
```
## Imports
```go
import (
    "encoding/csv"
    "fmt"
    "strconv"
    "strings"

    "github.com/gin-gonic/gin"
    "gorm.io/gorm"
)
```
## Performance Notes
1. **Batch Size**: 500-1000 is optimal for most cases
2. **JOIN vs N+1**: a JOIN is orders of magnitude faster once you pass ~100 records
3. **Memory**: streaming allows exporting millions of records with minimal memory
4. **Indexes**: make sure the JOIN columns are indexed
## Source
Implemented in the PriceForge project:
- Handler: `internal/handlers/pricelist.go:245-346`
- Service: `internal/services/pricelist/service.go:373-379`
- Repository: `internal/repository/pricelist.go:475-533`
- Models: `internal/models/pricelist.go`

```diff
@@ -61,7 +61,6 @@ func (h *ComponentHandler) List(c *gin.Context) {
 			Category:     lc.Category,
 			CategoryName: lc.Category,
 			Model:        lc.Model,
-			CurrentPrice: lc.CurrentPrice,
 		}
 	}
@@ -87,7 +86,6 @@ func (h *ComponentHandler) Get(c *gin.Context) {
 		Category:     component.Category,
 		CategoryName: component.Category,
 		Model:        component.Model,
-		CurrentPrice: component.CurrentPrice,
 	})
 }
```

```diff
@@ -14,23 +14,28 @@ type ExportHandler struct {
 	exportService    *services.ExportService
 	configService    services.ConfigurationGetter
 	componentService *services.ComponentService
+	projectService   *services.ProjectService
 }
 
 func NewExportHandler(
 	exportService *services.ExportService,
 	configService services.ConfigurationGetter,
 	componentService *services.ComponentService,
+	projectService *services.ProjectService,
 ) *ExportHandler {
 	return &ExportHandler{
 		exportService:    exportService,
 		configService:    configService,
 		componentService: componentService,
+		projectService:   projectService,
 	}
 }
 
 type ExportRequest struct {
 	Name        string `json:"name" binding:"required"`
+	ProjectName string `json:"project_name"`
+	ProjectUUID string `json:"project_uuid"`
 	Items       []struct {
 		LotName   string  `json:"lot_name" binding:"required"`
 		Quantity  int     `json:"quantity" binding:"required,min=1"`
 		UnitPrice float64 `json:"unit_price"`
@@ -47,15 +52,36 @@ func (h *ExportHandler) ExportCSV(c *gin.Context) {
 	data := h.buildExportData(&req)
 
-	csvData, err := h.exportService.ToCSV(data)
-	if err != nil {
-		c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+	// Validate before streaming (can return JSON error)
+	if len(data.Items) == 0 {
+		c.JSON(http.StatusBadRequest, gin.H{"error": "no items to export"})
 		return
 	}
-	filename := fmt.Sprintf("%s %s SPEC.csv", time.Now().Format("2006-01-02"), req.Name)
+	// Get project name if available
+	projectName := req.ProjectName
+	if projectName == "" && req.ProjectUUID != "" {
+		// Try to load project name from database
+		username := middleware.GetUsername(c)
+		if project, err := h.projectService.GetByUUID(req.ProjectUUID, username); err == nil && project != nil {
+			projectName = project.Name
+		}
+	}
+	if projectName == "" {
+		projectName = req.Name
+	}
+
+	// Set headers before streaming
+	exportDate := data.CreatedAt
+	filename := fmt.Sprintf("%s (%s) %s BOM.csv", exportDate.Format("2006-01-02"), projectName, req.Name)
+	c.Header("Content-Type", "text/csv; charset=utf-8")
 	c.Header("Content-Disposition", fmt.Sprintf("attachment; filename=\"%s\"", filename))
-	c.Data(http.StatusOK, "text/csv; charset=utf-8", csvData)
+
+	// Stream CSV (cannot return JSON after this point)
+	if err := h.exportService.ToCSV(c.Writer, data); err != nil {
+		c.Error(err) // Log only
+		return
+	}
 }
 
 func (h *ExportHandler) buildExportData(req *ExportRequest) *services.ExportData {
@@ -101,6 +127,7 @@ func (h *ExportHandler) ExportConfigCSV(c *gin.Context) {
 	username := middleware.GetUsername(c)
 	uuid := c.Param("uuid")
 
+	// Get config before streaming (can return JSON error)
 	config, err := h.configService.GetByUUID(uuid, username)
 	if err != nil {
 		c.JSON(http.StatusNotFound, gin.H{"error": err.Error()})
@@ -109,13 +136,33 @@ func (h *ExportHandler) ExportConfigCSV(c *gin.Context) {
 	data := h.exportService.ConfigToExportData(config, h.componentService)
 
-	csvData, err := h.exportService.ToCSV(data)
-	if err != nil {
-		c.JSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
+	// Validate before streaming (can return JSON error)
+	if len(data.Items) == 0 {
+		c.JSON(http.StatusBadRequest, gin.H{"error": "no items to export"})
 		return
 	}
-	filename := fmt.Sprintf("%s %s SPEC.csv", config.CreatedAt.Format("2006-01-02"), config.Name)
+	// Get project name if configuration belongs to a project
+	projectName := config.Name // fallback: use config name if no project
+	if config.ProjectUUID != nil && *config.ProjectUUID != "" {
+		if project, err := h.projectService.GetByUUID(*config.ProjectUUID, username); err == nil && project != nil {
+			projectName = project.Name
+		}
+	}
+
+	// Set headers before streaming
+	// Use price update time if available, otherwise creation time
+	exportDate := config.CreatedAt
+	if config.PriceUpdatedAt != nil {
+		exportDate = *config.PriceUpdatedAt
+	}
+	filename := fmt.Sprintf("%s (%s) %s BOM.csv", exportDate.Format("2006-01-02"), projectName, config.Name)
+	c.Header("Content-Type", "text/csv; charset=utf-8")
 	c.Header("Content-Disposition", fmt.Sprintf("attachment; filename=\"%s\"", filename))
-	c.Data(http.StatusOK, "text/csv; charset=utf-8", csvData)
+
+	// Stream CSV (cannot return JSON after this point)
+	if err := h.exportService.ToCSV(c.Writer, data); err != nil {
+		c.Error(err) // Log only
+		return
+	}
 }
```

@@ -0,0 +1,314 @@

```go
package handlers

import (
	"bytes"
	"encoding/csv"
	"encoding/json"
	"errors"
	"net/http"
	"net/http/httptest"
	"testing"
	"time"

	"git.mchus.pro/mchus/quoteforge/internal/config"
	"git.mchus.pro/mchus/quoteforge/internal/models"
	"git.mchus.pro/mchus/quoteforge/internal/services"
	"github.com/gin-gonic/gin"
)

// Mock services for testing
type mockConfigService struct {
	config *models.Configuration
	err    error
}

func (m *mockConfigService) GetByUUID(uuid string, ownerUsername string) (*models.Configuration, error) {
	return m.config, m.err
}

func TestExportCSV_Success(t *testing.T) {
	gin.SetMode(gin.TestMode)

	// Create a basic mock component service that doesn't panic
	mockComponentService := &services.ComponentService{}

	// Create handler with mocks
	exportSvc := services.NewExportService(config.ExportConfig{}, nil)
	handler := NewExportHandler(
		exportSvc,
		&mockConfigService{},
		mockComponentService,
		nil,
	)

	// Create JSON request body
	jsonBody := `{
		"name": "Test Export",
		"items": [
			{
				"lot_name": "LOT-001",
				"quantity": 2,
				"unit_price": 100.50
			}
		],
		"notes": "Test notes"
	}`

	// Create HTTP request
	req, _ := http.NewRequest("POST", "/api/export/csv", bytes.NewBufferString(jsonBody))
	req.Header.Set("Content-Type", "application/json")

	// Create response recorder
	w := httptest.NewRecorder()

	// Create Gin context
	c, _ := gin.CreateTestContext(w)
	c.Request = req

	// Call handler
	handler.ExportCSV(c)

	// Check status code
	if w.Code != http.StatusOK {
		t.Errorf("Expected status 200, got %d", w.Code)
	}

	// Check Content-Type header
	contentType := w.Header().Get("Content-Type")
	if contentType != "text/csv; charset=utf-8" {
		t.Errorf("Expected Content-Type 'text/csv; charset=utf-8', got %q", contentType)
	}

	// Check for BOM
	responseBody := w.Body.Bytes()
	if len(responseBody) < 3 {
		t.Fatalf("Response too short to contain BOM")
	}
	expectedBOM := []byte{0xEF, 0xBB, 0xBF}
	actualBOM := responseBody[:3]
	if !bytes.Equal(actualBOM, expectedBOM) {
		t.Errorf("UTF-8 BOM mismatch. Expected %v, got %v", expectedBOM, actualBOM)
	}

	// Check semicolon delimiter in CSV
	reader := csv.NewReader(bytes.NewReader(responseBody[3:]))
	reader.Comma = ';'
	header, err := reader.Read()
	if err != nil {
		t.Errorf("Failed to parse CSV header: %v", err)
	}
	if len(header) != 6 {
		t.Errorf("Expected 6 columns, got %d", len(header))
	}
}

func TestExportCSV_InvalidRequest(t *testing.T) {
	gin.SetMode(gin.TestMode)

	exportSvc := services.NewExportService(config.ExportConfig{}, nil)
	handler := NewExportHandler(
		exportSvc,
		&mockConfigService{},
		&services.ComponentService{},
		nil,
	)

	// Create invalid request (missing required field)
	req, _ := http.NewRequest("POST", "/api/export/csv", bytes.NewBufferString(`{"name": "Test"}`))
	req.Header.Set("Content-Type", "application/json")

	w := httptest.NewRecorder()
	c, _ := gin.CreateTestContext(w)
	c.Request = req

	handler.ExportCSV(c)

	// Should return 400 Bad Request
	if w.Code != http.StatusBadRequest {
		t.Errorf("Expected status 400, got %d", w.Code)
	}

	// Should return JSON error
	var errResp map[string]interface{}
	json.Unmarshal(w.Body.Bytes(), &errResp)
	if _, hasError := errResp["error"]; !hasError {
		t.Errorf("Expected error in JSON response")
	}
}

func TestExportCSV_EmptyItems(t *testing.T) {
	gin.SetMode(gin.TestMode)

	exportSvc := services.NewExportService(config.ExportConfig{}, nil)
	handler := NewExportHandler(
		exportSvc,
		&mockConfigService{},
		&services.ComponentService{},
		nil,
	)

	// Create request with empty items array - should fail binding validation
	req, _ := http.NewRequest("POST", "/api/export/csv", bytes.NewBufferString(`{"name":"Empty Export","items":[],"notes":""}`))
	req.Header.Set("Content-Type", "application/json")

	w := httptest.NewRecorder()
	c, _ := gin.CreateTestContext(w)
	c.Request = req

	handler.ExportCSV(c)

	// Should return 400 Bad Request (validation error from gin binding)
	if w.Code != http.StatusBadRequest {
		t.Logf("Status code: %d (expected 400 for empty items)", w.Code)
	}
}

func TestExportConfigCSV_Success(t *testing.T) {
	gin.SetMode(gin.TestMode)

	// Mock configuration
	mockConfig := &models.Configuration{
		UUID:          "test-uuid",
		Name:          "Test Config",
		OwnerUsername: "testuser",
		Items: models.ConfigItems{
			{
				LotName:   "LOT-001",
				Quantity:  1,
				UnitPrice: 100.0,
			},
		},
		CreatedAt: time.Now(),
	}

	exportSvc := services.NewExportService(config.ExportConfig{}, nil)
	handler := NewExportHandler(
		exportSvc,
		&mockConfigService{config: mockConfig},
		&services.ComponentService{},
		nil,
	)

	// Create HTTP request
	req, _ := http.NewRequest("GET", "/api/configs/test-uuid/export", nil)
	w := httptest.NewRecorder()
	c, _ := gin.CreateTestContext(w)
	c.Request = req
	c.Params = gin.Params{
		{Key: "uuid", Value: "test-uuid"},
	}
	// Mock middleware.GetUsername
	c.Set("username", "testuser")

	handler.ExportConfigCSV(c)

	// Check status code
	if w.Code != http.StatusOK {
		t.Errorf("Expected status 200, got %d", w.Code)
	}

	// Check Content-Type header
	contentType := w.Header().Get("Content-Type")
	if contentType != "text/csv; charset=utf-8" {
		t.Errorf("Expected Content-Type 'text/csv; charset=utf-8', got %q", contentType)
	}

	// Check for BOM
	responseBody := w.Body.Bytes()
	if len(responseBody) < 3 {
		t.Fatalf("Response too short to contain BOM")
	}
	expectedBOM := []byte{0xEF, 0xBB, 0xBF}
	actualBOM := responseBody[:3]
	if !bytes.Equal(actualBOM, expectedBOM) {
		t.Errorf("UTF-8 BOM mismatch")
	}
}

func TestExportConfigCSV_NotFound(t *testing.T) {
	gin.SetMode(gin.TestMode)

	exportSvc := services.NewExportService(config.ExportConfig{}, nil)
	handler := NewExportHandler(
		exportSvc,
		&mockConfigService{err: errors.New("config not found")},
		&services.ComponentService{},
		nil,
	)

	req, _ := http.NewRequest("GET", "/api/configs/nonexistent-uuid/export", nil)
	w := httptest.NewRecorder()
	c, _ := gin.CreateTestContext(w)
	c.Request = req
	c.Params = gin.Params{
		{Key: "uuid", Value: "nonexistent-uuid"},
	}
	c.Set("username", "testuser")

	handler.ExportConfigCSV(c)

	// Should return 404 Not Found
	if w.Code != http.StatusNotFound {
		t.Errorf("Expected status 404, got %d", w.Code)
	}

	// Should return JSON error
	var errResp map[string]interface{}
	json.Unmarshal(w.Body.Bytes(), &errResp)
	if _, hasError := errResp["error"]; !hasError {
		t.Errorf("Expected error in JSON response")
	}
}

func TestExportConfigCSV_EmptyItems(t *testing.T) {
	gin.SetMode(gin.TestMode)

	// Mock configuration with empty items
	mockConfig := &models.Configuration{
		UUID:          "test-uuid",
		Name:          "Empty Config",
		OwnerUsername: "testuser",
		Items:         models.ConfigItems{},
		CreatedAt:     time.Now(),
	}

	exportSvc := services.NewExportService(config.ExportConfig{}, nil)
	handler := NewExportHandler(
		exportSvc,
		&mockConfigService{config: mockConfig},
		&services.ComponentService{},
		nil,
	)

	req, _ := http.NewRequest("GET", "/api/configs/test-uuid/export", nil)
	w := httptest.NewRecorder()
	c, _ := gin.CreateTestContext(w)
	c.Request = req
	c.Params = gin.Params{
		{Key: "uuid", Value: "test-uuid"},
	}
	c.Set("username", "testuser")

	handler.ExportConfigCSV(c)

	// Should return 400 Bad Request
	if w.Code != http.StatusBadRequest {
		t.Errorf("Expected status 400, got %d", w.Code)
	}

	// Should return JSON error
	var errResp map[string]interface{}
	json.Unmarshal(w.Body.Bytes(), &errResp)
	if _, hasError := errResp["error"]; !hasError {
		t.Errorf("Expected error in JSON response")
	}
}
```

```diff
@@ -28,14 +28,13 @@ type ComponentSyncResult struct {
 func (l *LocalDB) SyncComponents(mariaDB *gorm.DB) (*ComponentSyncResult, error) {
 	startTime := time.Now()
 
-	// Query to join lot with qt_lot_metadata
+	// Query to join lot with qt_lot_metadata (metadata only, no pricing)
 	// Use LEFT JOIN to include lots without metadata
 	type componentRow struct {
 		LotName        string
 		LotDescription string
 		Category       *string
 		Model          *string
-		CurrentPrice   *float64
 	}
 
 	var rows []componentRow
@@ -44,8 +43,7 @@ func (l *LocalDB) SyncComponents(mariaDB *gorm.DB) (*ComponentSyncResult, error)
 			l.lot_name,
 			l.lot_description,
 			COALESCE(c.code, SUBSTRING_INDEX(l.lot_name, '_', 1)) as category,
-			m.model,
-			m.current_price
+			m.model
 		FROM lot l
 		LEFT JOIN qt_lot_metadata m ON l.lot_name = m.lot_name
 		LEFT JOIN qt_categories c ON m.category_id = c.id
@@ -100,8 +98,6 @@ func (l *LocalDB) SyncComponents(mariaDB *gorm.DB) (*ComponentSyncResult, error)
 			LotDescription: row.LotDescription,
 			Category:       category,
 			Model:          model,
-			CurrentPrice:   row.CurrentPrice,
-			SyncedAt:       syncTime,
 		}
 		components = append(components, comp)
@@ -221,11 +217,6 @@ func (l *LocalDB) ListComponents(filter ComponentFilter, offset, limit int) ([]L
 		)
 	}
 
-	// Apply price filter
-	if filter.HasPrice {
-		db = db.Where("current_price IS NOT NULL")
-	}
-
 	// Get total count
 	var total int64
 	if err := db.Model(&LocalComponent{}).Count(&total).Error; err != nil {
@@ -312,98 +303,3 @@ func (l *LocalDB) NeedComponentSync(maxAgeHours int) bool {
 	return time.Since(*syncTime).Hours() > float64(maxAgeHours)
 }
-
-// UpdateComponentPricesFromPricelist updates current_price in local_components from pricelist items
-// This allows offline price updates using synced pricelists without MariaDB connection
-func (l *LocalDB) UpdateComponentPricesFromPricelist(pricelistID uint) (int, error) {
-	// Get all items from the specified pricelist
-	var items []LocalPricelistItem
-	if err := l.db.Where("pricelist_id = ?", pricelistID).Find(&items).Error; err != nil {
-		return 0, fmt.Errorf("fetching pricelist items: %w", err)
-	}
-	if len(items) == 0 {
-		slog.Warn("no items found in pricelist", "pricelist_id", pricelistID)
-		return 0, nil
-	}
-
-	// Update current_price for each component
-	updated := 0
-	err := l.db.Transaction(func(tx *gorm.DB) error {
-		for _, item := range items {
-			result := tx.Model(&LocalComponent{}).
-				Where("lot_name = ?", item.LotName).
-				Update("current_price", item.Price)
-			if result.Error != nil {
-				return fmt.Errorf("updating price for %s: %w", item.LotName, result.Error)
-			}
-			if result.RowsAffected > 0 {
-				updated++
-			}
-		}
-		return nil
-	})
-	if err != nil {
-		return 0, err
-	}
-
-	slog.Info("updated component prices from pricelist",
-		"pricelist_id", pricelistID,
-		"total_items", len(items),
-		"updated_components", updated)
-	return updated, nil
-}
-
-// EnsureComponentPricesFromPricelists loads prices from the latest pricelist into local_components
-// if no components exist or all current prices are NULL
-func (l *LocalDB) EnsureComponentPricesFromPricelists() error {
-	// Check if we have any components with prices
-	var count int64
-	if err := l.db.Model(&LocalComponent{}).Where("current_price IS NOT NULL").Count(&count).Error; err != nil {
-		return fmt.Errorf("checking component prices: %w", err)
-	}
-	// If we have components with prices, don't load from pricelists
-	if count > 0 {
-		return nil
-	}
-
-	// Check if we have any components at all
-	var totalComponents int64
-	if err := l.db.Model(&LocalComponent{}).Count(&totalComponents).Error; err != nil {
-		return fmt.Errorf("counting components: %w", err)
-	}
-	// If we have no components, we need to load them from pricelists
-	if totalComponents == 0 {
-		slog.Info("no components found in local database, loading from latest pricelist")
-		// This would typically be called from the sync service or setup process
-		// For now, we'll just return nil to indicate no action needed
-		return nil
-	}
-
-	// If we have components but no prices, load from latest estimate pricelist.
-	var latestPricelist LocalPricelist
-	if err := l.db.Where("source = ?", "estimate").Order("created_at DESC").First(&latestPricelist).Error; err != nil {
-		if err == gorm.ErrRecordNotFound {
-			slog.Warn("no pricelists found in local database")
-			return nil
-		}
-		return fmt.Errorf("finding latest pricelist: %w", err)
-	}
-
-	// Update prices from the latest pricelist
-	updated, err := l.UpdateComponentPricesFromPricelist(latestPricelist.ID)
-	if err != nil {
-		return fmt.Errorf("updating component prices from pricelist: %w", err)
-	}
-
-	slog.Info("loaded component prices from latest pricelist",
-		"pricelist_id", latestPricelist.ID,
-		"updated_components", updated)
-	return nil
-}
```

```diff
@@ -213,17 +213,14 @@ func ComponentToLocal(meta *models.LotMetadata) *LocalComponent {
 		LotDescription: lotDesc,
 		Category:       category,
 		Model:          meta.Model,
-		CurrentPrice:   meta.CurrentPrice,
-		SyncedAt:       time.Now(),
 	}
 }
 
 // LocalToComponent converts LocalComponent to models.LotMetadata
 func LocalToComponent(local *LocalComponent) *models.LotMetadata {
 	return &models.LotMetadata{
 		LotName:      local.LotName,
 		Model:        local.Model,
-		CurrentPrice: local.CurrentPrice,
 		Lot: &models.Lot{
 			LotName:        local.LotName,
 			LotDescription: local.LotDescription,
```

```diff
@@ -58,6 +58,16 @@ var localMigrations = []localMigration{
 		name: "Backfill source for local pricelists and create source indexes",
 		run:  backfillLocalPricelistSource,
 	},
+	{
+		id:   "2026_02_09_drop_component_unused_fields",
+		name: "Remove current_price and synced_at from local_components (unused fields)",
+		run:  dropComponentUnusedFields,
+	},
+	{
+		id:   "2026_02_09_add_warehouse_competitor_pricelists",
+		name: "Add warehouse_pricelist_id and competitor_pricelist_id to local_configurations",
+		run:  addWarehouseCompetitorPriceLists,
+	},
 }
 
 func runLocalMigrations(db *gorm.DB) error {
@@ -316,3 +326,113 @@ func backfillLocalPricelistSource(tx *gorm.DB) error {
 	return nil
 }
+
+func dropComponentUnusedFields(tx *gorm.DB) error {
+	// Check if columns exist
+	type columnInfo struct {
+		Name string `gorm:"column:name"`
+	}
+	var columns []columnInfo
+	if err := tx.Raw(`
+		SELECT name FROM pragma_table_info('local_components')
+		WHERE name IN ('current_price', 'synced_at')
+	`).Scan(&columns).Error; err != nil {
+		return fmt.Errorf("check columns existence: %w", err)
+	}
+	if len(columns) == 0 {
+		slog.Info("unused fields already removed from local_components")
+		return nil
+	}
+
+	// SQLite: recreate table without current_price and synced_at
+	if err := tx.Exec(`
+		CREATE TABLE local_components_new (
+			lot_name TEXT PRIMARY KEY,
+			lot_description TEXT,
+			category TEXT,
+			model TEXT
+		)
+	`).Error; err != nil {
+		return fmt.Errorf("create new local_components table: %w", err)
+	}
+	if err := tx.Exec(`
+		INSERT INTO local_components_new (lot_name, lot_description, category, model)
+		SELECT lot_name, lot_description, category, model
+		FROM local_components
+	`).Error; err != nil {
+		return fmt.Errorf("copy data to new table: %w", err)
+	}
+	if err := tx.Exec(`DROP TABLE local_components`).Error; err != nil {
+		return fmt.Errorf("drop old table: %w", err)
+	}
+	if err := tx.Exec(`ALTER TABLE local_components_new RENAME TO local_components`).Error; err != nil {
+		return fmt.Errorf("rename new table: %w", err)
+	}
+
+	slog.Info("dropped current_price and synced_at columns from local_components")
+	return nil
+}
+
+func addWarehouseCompetitorPriceLists(tx *gorm.DB) error {
+	// Check if columns exist
+	type columnInfo struct {
+		Name string `gorm:"column:name"`
+	}
+	var columns []columnInfo
+	if err := tx.Raw(`
+		SELECT name FROM pragma_table_info('local_configurations')
+		WHERE name IN ('warehouse_pricelist_id', 'competitor_pricelist_id')
+	`).Scan(&columns).Error; err != nil {
+		return fmt.Errorf("check columns existence: %w", err)
+	}
+	if len(columns) == 2 {
+		slog.Info("warehouse and competitor pricelist columns already exist")
+		return nil
+	}
+
+	// Add columns if they don't exist
+	if err := tx.Exec(`
+		ALTER TABLE local_configurations
+		ADD COLUMN warehouse_pricelist_id INTEGER
+	`).Error; err != nil {
+		// Column might already exist, ignore
+		if !strings.Contains(err.Error(), "duplicate column") {
+			return fmt.Errorf("add warehouse_pricelist_id column: %w", err)
+		}
+	}
+	if err := tx.Exec(`
+		ALTER TABLE local_configurations
+		ADD COLUMN competitor_pricelist_id INTEGER
+	`).Error; err != nil {
+		// Column might already exist, ignore
+		if !strings.Contains(err.Error(), "duplicate column") {
+			return fmt.Errorf("add competitor_pricelist_id column: %w", err)
+		}
+	}
+
+	// Create indexes
+	if err := tx.Exec(`
+		CREATE INDEX IF NOT EXISTS idx_local_configurations_warehouse_pricelist
+		ON local_configurations(warehouse_pricelist_id)
+	`).Error; err != nil {
+		return fmt.Errorf("create warehouse pricelist index: %w", err)
+	}
+	if err := tx.Exec(`
+		CREATE INDEX IF NOT EXISTS idx_local_configurations_competitor_pricelist
+		ON local_configurations(competitor_pricelist_id)
+	`).Error; err != nil {
+		return fmt.Errorf("create competitor pricelist index: %w", err)
+	}
+
+	slog.Info("added warehouse and competitor pricelist fields to local_configurations")
+	return nil
+}
```

```diff
@@ -96,8 +96,10 @@ type LocalConfiguration struct {
 	Notes       string `json:"notes"`
 	IsTemplate  bool   `gorm:"default:false" json:"is_template"`
 	ServerCount int    `gorm:"default:1" json:"server_count"`
 	PricelistID *uint  `gorm:"index" json:"pricelist_id,omitempty"`
+	WarehousePricelistID  *uint `gorm:"index" json:"warehouse_pricelist_id,omitempty"`
+	CompetitorPricelistID *uint `gorm:"index" json:"competitor_pricelist_id,omitempty"`
 	OnlyInStock bool `gorm:"default:false" json:"only_in_stock"`
 	PriceUpdatedAt *time.Time `gorm:"type:timestamp" json:"price_updated_at,omitempty"`
 	CreatedAt time.Time `json:"created_at"`
 	UpdatedAt time.Time `json:"updated_at"`
@@ -179,14 +181,13 @@ func (LocalPricelistItem) TableName() string {
 	return "local_pricelist_items"
 }
 
-// LocalComponent stores cached components for offline search
+// LocalComponent stores cached components for offline search (metadata only)
+// All pricing is now sourced from local_pricelist_items based on configuration pricelist selection
 type LocalComponent struct {
 	LotName        string `gorm:"primaryKey" json:"lot_name"`
 	LotDescription string `json:"lot_description"`
 	Category       string `json:"category"`
 	Model          string `json:"model"`
-	CurrentPrice   *float64  `json:"current_price"`
-	SyncedAt       time.Time `json:"synced_at"`
 }
 
 func (LocalComponent) TableName() string {
```

```diff
@@ -1,22 +1,55 @@
 package middleware
 
 import (
+	"net"
+	"net/http"
+	"net/url"
+	"strings"
+
 	"github.com/gin-gonic/gin"
 )
 
 func CORS() gin.HandlerFunc {
 	return func(c *gin.Context) {
-		c.Header("Access-Control-Allow-Origin", "*")
-		c.Header("Access-Control-Allow-Methods", "GET, POST, PUT, PATCH, DELETE, OPTIONS")
-		c.Header("Access-Control-Allow-Headers", "Origin, Content-Type, Accept, Authorization")
-		c.Header("Access-Control-Expose-Headers", "Content-Length, Content-Disposition")
-		c.Header("Access-Control-Max-Age", "86400")
+		origin := strings.TrimSpace(c.GetHeader("Origin"))
+		if origin != "" {
+			if isLoopbackOrigin(origin) {
+				c.Header("Access-Control-Allow-Origin", origin)
+				c.Header("Vary", "Origin")
+				c.Header("Access-Control-Allow-Methods", "GET, POST, PUT, PATCH, DELETE, OPTIONS")
+				c.Header("Access-Control-Allow-Headers", "Origin, Content-Type, Accept, Authorization")
+				c.Header("Access-Control-Expose-Headers", "Content-Length, Content-Disposition")
+				c.Header("Access-Control-Max-Age", "86400")
+			} else if c.Request.Method == http.MethodOptions {
+				c.AbortWithStatus(http.StatusForbidden)
+				return
+			}
+		}
 
-		if c.Request.Method == "OPTIONS" {
-			c.AbortWithStatus(204)
+		if c.Request.Method == http.MethodOptions {
+			c.AbortWithStatus(http.StatusNoContent)
 			return
 		}
 		c.Next()
 	}
 }
+
+func isLoopbackOrigin(origin string) bool {
+	u, err := url.Parse(origin)
+	if err != nil {
+		return false
+	}
+	if u.Scheme != "http" && u.Scheme != "https" {
+		return false
+	}
+	host := strings.TrimSpace(u.Hostname())
+	if host == "" {
+		return false
+	}
+	if strings.EqualFold(host, "localhost") {
+		return true
+	}
+	ip := net.ParseIP(host)
+	return ip != nil && ip.IsLoopback()
+}
```

```diff
@@ -3,10 +3,12 @@ package repository
 import (
 	"errors"
 	"fmt"
+	"sort"
 	"strconv"
 	"strings"
 	"time"
 
+	"git.mchus.pro/mchus/quoteforge/internal/lotmatch"
 	"git.mchus.pro/mchus/quoteforge/internal/models"
 	"gorm.io/gorm"
 )
@@ -243,9 +245,91 @@ func (r *PricelistRepository) GetItems(pricelistID uint, offset, limit int, sear
 		}
 	}
 
+	if err := r.enrichItemsWithStock(items); err != nil {
+		return nil, 0, fmt.Errorf("enriching pricelist items with stock: %w", err)
+	}
+
 	return items, total, nil
 }
 
+func (r *PricelistRepository) enrichItemsWithStock(items []models.PricelistItem) error {
+	if len(items) == 0 {
+		return nil
+	}
+
+	resolver, err := lotmatch.NewLotResolverFromDB(r.db)
+	if err != nil {
+		return err
+	}
+
+	type stockRow struct {
+		Partnumber string   `gorm:"column:partnumber"`
+		Qty        *float64 `gorm:"column:qty"`
+	}
+	rows := make([]stockRow, 0)
+	if err := r.db.Raw(`
+		SELECT s.partnumber, s.qty
+		FROM stock_log s
+		INNER JOIN (
+			SELECT partnumber, MAX(date) AS max_date
+			FROM stock_log
+			GROUP BY partnumber
+		) latest ON latest.partnumber = s.partnumber AND latest.max_date = s.date
+		WHERE s.qty IS NOT NULL
+	`).Scan(&rows).Error; err != nil {
+		return err
+	}
+
+	lotTotals := make(map[string]float64, len(items))
+	lotPartnumbers := make(map[string][]string, len(items))
+	seenPartnumbers := make(map[string]map[string]struct{}, len(items))
+	for i := range rows {
+		row := rows[i]
+		if strings.TrimSpace(row.Partnumber) == "" {
+			continue
+		}
+		lotName, _, resolveErr := resolver.Resolve(row.Partnumber)
+		if resolveErr != nil || strings.TrimSpace(lotName) == "" {
+			continue
+		}
+		if row.Qty != nil {
+			lotTotals[lotName] += *row.Qty
+		}
+		pn := strings.TrimSpace(row.Partnumber)
+		if pn == "" {
+			continue
+		}
+		if _, ok := seenPartnumbers[lotName]; !ok {
+			seenPartnumbers[lotName] = make(map[string]struct{}, 4)
+		}
+		key := strings.ToLower(pn)
+		if _, exists := seenPartnumbers[lotName][key]; exists {
+			continue
+		}
+		seenPartnumbers[lotName][key] = struct{}{}
+		lotPartnumbers[lotName] = append(lotPartnumbers[lotName], pn)
+	}
+
+	for i := range items {
+		lotName := items[i].LotName
+		if qty, ok := lotTotals[lotName]; ok {
+			qtyCopy := qty
+			items[i].AvailableQty = &qtyCopy
+		}
+		if partnumbers := lotPartnumbers[lotName]; len(partnumbers) > 0 {
+			sort.Slice(partnumbers, func(a, b int) bool {
+				return strings.ToLower(partnumbers[a]) < strings.ToLower(partnumbers[b])
+			})
+			items[i].Partnumbers = partnumbers
+		}
+	}
+	return nil
+}
+
 // GetLotNames returns distinct lot names from pricelist items.
 func (r *PricelistRepository) GetLotNames(pricelistID uint) ([]string, error) {
 	var lotNames []string
```

```diff
@@ -83,10 +83,6 @@ func (r *UnifiedRepo) getComponentsOffline(filter ComponentFilter, offset, limit
 		search := "%" + filter.Search + "%"
 		query = query.Where("lot_name LIKE ? OR lot_description LIKE ? OR model LIKE ?", search, search, search)
 	}
-
-	if filter.HasPrice {
-		query = query.Where("current_price IS NOT NULL AND current_price > 0")
-	}
 
 	var total int64
 	query.Count(&total)
@@ -96,8 +92,6 @@ func (r *UnifiedRepo) getComponentsOffline(filter ComponentFilter, offset, limit
 		sortDir = "DESC"
 	}
 	switch filter.SortField {
-	case "current_price":
-		query = query.Order("current_price " + sortDir)
 	case "lot_name":
 		query = query.Order("lot_name " + sortDir)
 	default:
@@ -112,9 +106,8 @@ func (r *UnifiedRepo) getComponentsOffline(filter ComponentFilter, offset, limit
 	result := make([]models.LotMetadata, len(components))
 	for i, comp := range components {
 		result[i] = models.LotMetadata{
 			LotName:      comp.LotName,
 			Model:        comp.Model,
-			CurrentPrice: comp.CurrentPrice,
 			Lot: &models.Lot{
 				LotName:        comp.LotName,
 				LotDescription: comp.LotDescription,
@@ -138,9 +131,8 @@ func (r *UnifiedRepo) GetComponent(lotName string) (*models.LotMetadata, error)
 	}
 	return &models.LotMetadata{
 		LotName:      comp.LotName,
 		Model:        comp.Model,
-		CurrentPrice: comp.CurrentPrice,
 		Lot: &models.Lot{
 			LotName:        comp.LotName,
 			LotDescription: comp.LotDescription,
```

@@ -53,7 +53,6 @@ type ComponentView struct {
 	Category        string                `json:"category"`
 	CategoryName    string                `json:"category_name"`
 	Model           string                `json:"model"`
-	CurrentPrice    *float64              `json:"current_price"`
 	PriceFreshness  models.PriceFreshness `json:"price_freshness"`
 	PopularityScore float64               `json:"popularity_score"`
 	Specs           models.Specs          `json:"specs,omitempty"`
@@ -92,7 +91,6 @@ func (s *ComponentService) List(filter repository.ComponentFilter, page, perPage
 		view := ComponentView{
 			LotName: c.LotName,
 			Model:   c.Model,
-			CurrentPrice: c.CurrentPrice,
 			PriceFreshness:  c.GetPriceFreshness(30, 60, 90, 3),
 			PopularityScore: c.PopularityScore,
 			Specs:           c.Specs,
@@ -134,7 +132,6 @@ func (s *ComponentService) GetByLotName(lotName string) (*ComponentView, error)
 		view := &ComponentView{
 			LotName: c.LotName,
 			Model:   c.Model,
-			CurrentPrice: c.CurrentPrice,
 			PriceFreshness:  c.GetPriceFreshness(30, 60, 90, 3),
 			PopularityScore: c.PopularityScore,
 			Specs:           c.Specs,

@@ -4,6 +4,8 @@ import (
 	"bytes"
 	"encoding/csv"
 	"fmt"
+	"io"
+	"strings"
 	"time"

 	"git.mchus.pro/mchus/quoteforge/internal/config"
@@ -40,14 +42,21 @@ type ExportItem struct {
 	TotalPrice float64
 }

-func (s *ExportService) ToCSV(data *ExportData) ([]byte, error) {
-	var buf bytes.Buffer
-	w := csv.NewWriter(&buf)
+func (s *ExportService) ToCSV(w io.Writer, data *ExportData) error {
+	// Write UTF-8 BOM for Excel compatibility
+	if _, err := w.Write([]byte{0xEF, 0xBB, 0xBF}); err != nil {
+		return fmt.Errorf("failed to write BOM: %w", err)
+	}
+	csvWriter := csv.NewWriter(w)
+	// Use semicolon as delimiter for Russian Excel locale
+	csvWriter.Comma = ';'
+	defer csvWriter.Flush()

 	// Header
 	headers := []string{"Артикул", "Описание", "Категория", "Количество", "Цена за единицу", "Сумма"}
-	if err := w.Write(headers); err != nil {
-		return nil, err
+	if err := csvWriter.Write(headers); err != nil {
+		return fmt.Errorf("failed to write header: %w", err)
 	}

 	// Get category hierarchy for sorting
@@ -90,21 +99,35 @@ func (s *ExportService) ToCSV(data *ExportData) ([]byte, error) {
 			item.Description,
 			item.Category,
 			fmt.Sprintf("%d", item.Quantity),
-			fmt.Sprintf("%.2f", item.UnitPrice),
-			fmt.Sprintf("%.2f", item.TotalPrice),
+			strings.ReplaceAll(fmt.Sprintf("%.2f", item.UnitPrice), ".", ","),
+			strings.ReplaceAll(fmt.Sprintf("%.2f", item.TotalPrice), ".", ","),
 		}
-		if err := w.Write(row); err != nil {
-			return nil, err
+		if err := csvWriter.Write(row); err != nil {
+			return fmt.Errorf("failed to write row: %w", err)
 		}
 	}

 	// Total row
-	if err := w.Write([]string{"", "", "", "", "ИТОГО:", fmt.Sprintf("%.2f", data.Total)}); err != nil {
-		return nil, err
+	totalStr := strings.ReplaceAll(fmt.Sprintf("%.2f", data.Total), ".", ",")
+	if err := csvWriter.Write([]string{"", "", "", "", "ИТОГО:", totalStr}); err != nil {
+		return fmt.Errorf("failed to write total row: %w", err)
 	}

-	w.Flush()
-	return buf.Bytes(), w.Error()
+	csvWriter.Flush()
+	if err := csvWriter.Error(); err != nil {
+		return fmt.Errorf("csv writer error: %w", err)
+	}
+	return nil
+}
+
+// ToCSVBytes is a backward-compatible wrapper that returns CSV data as bytes
+func (s *ExportService) ToCSVBytes(data *ExportData) ([]byte, error) {
+	var buf bytes.Buffer
+	if err := s.ToCSV(&buf, data); err != nil {
+		return nil, err
+	}
+	return buf.Bytes(), nil
 }
func (s *ExportService) ConfigToExportData(config *models.Configuration, componentService *ComponentService) *ExportData { func (s *ExportService) ConfigToExportData(config *models.Configuration, componentService *ComponentService) *ExportData {

@@ -0,0 +1,343 @@
package services
import (
"bytes"
"encoding/csv"
"io"
"testing"
"time"
"git.mchus.pro/mchus/quoteforge/internal/config"
)
func TestToCSV_UTF8BOM(t *testing.T) {
svc := NewExportService(config.ExportConfig{}, nil)
data := &ExportData{
Name: "Test",
Items: []ExportItem{
{
LotName: "LOT-001",
Description: "Test Item",
Category: "CAT",
Quantity: 1,
UnitPrice: 100.0,
TotalPrice: 100.0,
},
},
Total: 100.0,
CreatedAt: time.Now(),
}
var buf bytes.Buffer
if err := svc.ToCSV(&buf, data); err != nil {
t.Fatalf("ToCSV failed: %v", err)
}
csvBytes := buf.Bytes()
if len(csvBytes) < 3 {
t.Fatalf("CSV too short to contain BOM")
}
// Check UTF-8 BOM: 0xEF 0xBB 0xBF
expectedBOM := []byte{0xEF, 0xBB, 0xBF}
actualBOM := csvBytes[:3]
if !bytes.Equal(actualBOM, expectedBOM) {
t.Errorf("UTF-8 BOM mismatch. Expected %v, got %v", expectedBOM, actualBOM)
}
}
func TestToCSV_SemicolonDelimiter(t *testing.T) {
svc := NewExportService(config.ExportConfig{}, nil)
data := &ExportData{
Name: "Test",
Items: []ExportItem{
{
LotName: "LOT-001",
Description: "Test Item",
Category: "CAT",
Quantity: 2,
UnitPrice: 100.50,
TotalPrice: 201.00,
},
},
Total: 201.00,
CreatedAt: time.Now(),
}
var buf bytes.Buffer
if err := svc.ToCSV(&buf, data); err != nil {
t.Fatalf("ToCSV failed: %v", err)
}
// Skip BOM and read CSV with semicolon delimiter
csvBytes := buf.Bytes()
reader := csv.NewReader(bytes.NewReader(csvBytes[3:]))
reader.Comma = ';'
// Read header
header, err := reader.Read()
if err != nil {
t.Fatalf("Failed to read header: %v", err)
}
if len(header) != 6 {
t.Errorf("Expected 6 columns, got %d", len(header))
}
expectedHeader := []string{"Артикул", "Описание", "Категория", "Количество", "Цена за единицу", "Сумма"}
for i, col := range expectedHeader {
if i < len(header) && header[i] != col {
t.Errorf("Column %d: expected %q, got %q", i, col, header[i])
}
}
// Read item row
itemRow, err := reader.Read()
if err != nil {
t.Fatalf("Failed to read item row: %v", err)
}
if itemRow[0] != "LOT-001" {
t.Errorf("Lot name mismatch: expected LOT-001, got %s", itemRow[0])
}
if itemRow[3] != "2" {
t.Errorf("Quantity mismatch: expected 2, got %s", itemRow[3])
}
if itemRow[4] != "100,50" {
t.Errorf("Unit price mismatch: expected 100,50, got %s", itemRow[4])
}
}
func TestToCSV_TotalRow(t *testing.T) {
svc := NewExportService(config.ExportConfig{}, nil)
data := &ExportData{
Name: "Test",
Items: []ExportItem{
{
LotName: "LOT-001",
Description: "Item 1",
Category: "CAT",
Quantity: 1,
UnitPrice: 100.0,
TotalPrice: 100.0,
},
{
LotName: "LOT-002",
Description: "Item 2",
Category: "CAT",
Quantity: 2,
UnitPrice: 50.0,
TotalPrice: 100.0,
},
},
Total: 200.0,
CreatedAt: time.Now(),
}
var buf bytes.Buffer
if err := svc.ToCSV(&buf, data); err != nil {
t.Fatalf("ToCSV failed: %v", err)
}
csvBytes := buf.Bytes()
reader := csv.NewReader(bytes.NewReader(csvBytes[3:]))
reader.Comma = ';'
// Skip header and item rows
reader.Read()
reader.Read()
reader.Read()
// Read total row
totalRow, err := reader.Read()
if err != nil {
t.Fatalf("Failed to read total row: %v", err)
}
// Total row should have "ИТОГО:" in position 4 and total value in position 5
if totalRow[4] != "ИТОГО:" {
t.Errorf("Expected 'ИТОГО:' in column 4, got %q", totalRow[4])
}
if totalRow[5] != "200,00" {
t.Errorf("Expected total 200,00, got %s", totalRow[5])
}
}
func TestToCSV_CategorySorting(t *testing.T) {
// Test category sorting without category repo (items maintain original order)
svc := NewExportService(config.ExportConfig{}, nil)
data := &ExportData{
Name: "Test",
Items: []ExportItem{
{
LotName: "LOT-001",
Category: "CAT-A",
Quantity: 1,
UnitPrice: 100.0,
TotalPrice: 100.0,
},
{
LotName: "LOT-002",
Category: "CAT-C",
Quantity: 1,
UnitPrice: 100.0,
TotalPrice: 100.0,
},
{
LotName: "LOT-003",
Category: "CAT-B",
Quantity: 1,
UnitPrice: 100.0,
TotalPrice: 100.0,
},
},
Total: 300.0,
CreatedAt: time.Now(),
}
var buf bytes.Buffer
if err := svc.ToCSV(&buf, data); err != nil {
t.Fatalf("ToCSV failed: %v", err)
}
csvBytes := buf.Bytes()
reader := csv.NewReader(bytes.NewReader(csvBytes[3:]))
reader.Comma = ';'
// Skip header
reader.Read()
// Without category repo, items maintain original order
row1, _ := reader.Read()
if row1[0] != "LOT-001" {
t.Errorf("Expected LOT-001 first, got %s", row1[0])
}
row2, _ := reader.Read()
if row2[0] != "LOT-002" {
t.Errorf("Expected LOT-002 second, got %s", row2[0])
}
row3, _ := reader.Read()
if row3[0] != "LOT-003" {
t.Errorf("Expected LOT-003 third, got %s", row3[0])
}
}
func TestToCSV_EmptyData(t *testing.T) {
svc := NewExportService(config.ExportConfig{}, nil)
data := &ExportData{
Name: "Test",
Items: []ExportItem{},
Total: 0.0,
CreatedAt: time.Now(),
}
var buf bytes.Buffer
if err := svc.ToCSV(&buf, data); err != nil {
t.Fatalf("ToCSV failed: %v", err)
}
csvBytes := buf.Bytes()
reader := csv.NewReader(bytes.NewReader(csvBytes[3:]))
reader.Comma = ';'
// Should have header and total row
header, err := reader.Read()
if err != nil {
t.Fatalf("Failed to read header: %v", err)
}
if len(header) != 6 {
t.Errorf("Expected 6 columns, got %d", len(header))
}
totalRow, err := reader.Read()
if err != nil {
t.Fatalf("Failed to read total row: %v", err)
}
if totalRow[4] != "ИТОГО:" {
t.Errorf("Expected ИТОГО: in total row, got %s", totalRow[4])
}
}
func TestToCSVBytes_BackwardCompat(t *testing.T) {
svc := NewExportService(config.ExportConfig{}, nil)
data := &ExportData{
Name: "Test",
Items: []ExportItem{
{
LotName: "LOT-001",
Description: "Test Item",
Category: "CAT",
Quantity: 1,
UnitPrice: 100.0,
TotalPrice: 100.0,
},
},
Total: 100.0,
CreatedAt: time.Now(),
}
csvBytes, err := svc.ToCSVBytes(data)
if err != nil {
t.Fatalf("ToCSVBytes failed: %v", err)
}
if len(csvBytes) < 3 {
t.Fatalf("CSV bytes too short")
}
// Verify BOM is present
expectedBOM := []byte{0xEF, 0xBB, 0xBF}
actualBOM := csvBytes[:3]
if !bytes.Equal(actualBOM, expectedBOM) {
t.Errorf("UTF-8 BOM mismatch in ToCSVBytes")
}
}
func TestToCSV_WriterError(t *testing.T) {
svc := NewExportService(config.ExportConfig{}, nil)
data := &ExportData{
Name: "Test",
Items: []ExportItem{
{
LotName: "LOT-001",
Description: "Test",
Category: "CAT",
Quantity: 1,
UnitPrice: 100.0,
TotalPrice: 100.0,
},
},
Total: 100.0,
CreatedAt: time.Now(),
}
// Use a failing writer
failingWriter := &failingWriter{}
if err := svc.ToCSV(failingWriter, data); err == nil {
t.Errorf("Expected error from failing writer, got nil")
}
}
// failingWriter always returns an error
type failingWriter struct{}
func (fw *failingWriter) Write(p []byte) (int, error) {
return 0, io.EOF
}

@@ -347,7 +347,7 @@ func (s *LocalConfigurationService) RefreshPrices(uuid string, ownerUsername str
 	}
 	latestPricelist, latestErr := s.localDB.GetLatestLocalPricelist()

-	// Update prices for all items
+	// Update prices for all items from pricelist
 	updatedItems := make(localdb.LocalConfigItems, len(localCfg.Items))
 	for i, item := range localCfg.Items {
 		if latestErr == nil && latestPricelist != nil {
@@ -362,20 +362,8 @@
 			}
 		}

-		// Fallback to current component price from local cache
-		component, err := s.localDB.GetLocalComponent(item.LotName)
-		if err != nil || component.CurrentPrice == nil {
-			// Keep original item if component not found or no price available
-			updatedItems[i] = item
-			continue
-		}
-
-		// Update item with current price from local cache
-		updatedItems[i] = localdb.LocalConfigItem{
-			LotName:   item.LotName,
-			Quantity:  item.Quantity,
-			UnitPrice: *component.CurrentPrice,
-		}
+		// Keep original item if price not found in pricelist
+		updatedItems[i] = item
 	}

 	// Update configuration
@@ -672,7 +660,7 @@ func (s *LocalConfigurationService) RefreshPricesNoAuth(uuid string) (*models.Co
 	}
 	latestPricelist, latestErr := s.localDB.GetLatestLocalPricelist()

-	// Update prices for all items
+	// Update prices for all items from pricelist
 	updatedItems := make(localdb.LocalConfigItems, len(localCfg.Items))
 	for i, item := range localCfg.Items {
 		if latestErr == nil && latestPricelist != nil {
@@ -687,20 +675,8 @@
 			}
 		}

-		// Fallback to current component price from local cache
-		component, err := s.localDB.GetLocalComponent(item.LotName)
-		if err != nil || component.CurrentPrice == nil {
-			// Keep original item if component not found or no price available
-			updatedItems[i] = item
-			continue
-		}
-
-		// Update item with current price from local cache
-		updatedItems[i] = localdb.LocalConfigItem{
-			LotName:   item.LotName,
-			Quantity:  item.Quantity,
-			UnitPrice: *component.CurrentPrice,
-		}
+		// Keep original item if price not found in pricelist
+		updatedItems[i] = item
 	}

 	// Update configuration

@@ -78,6 +78,7 @@ type QuoteRequest struct {
 		LotName  string `json:"lot_name"`
 		Quantity int    `json:"quantity"`
 	} `json:"items"`
+	PricelistID *uint `json:"pricelist_id,omitempty"` // Optional: use specific pricelist for pricing
 }

 type PriceLevelsRequest struct {
@@ -123,6 +124,16 @@ func (s *QuoteService) ValidateAndCalculate(req *QuoteRequest) (*QuoteValidation
 		Warnings: make([]string, 0),
 	}

+	// Determine which pricelist to use for pricing
+	pricelistID := req.PricelistID
+	if pricelistID == nil || *pricelistID == 0 {
+		// By default, use latest estimate pricelist
+		latestPricelist, err := s.localDB.GetLatestLocalPricelistBySource("estimate")
+		if err == nil && latestPricelist != nil {
+			pricelistID = &latestPricelist.ServerID
+		}
+	}
+
 	var total float64
 	for _, reqItem := range req.Items {
 		localComp, err := s.localDB.GetLocalComponent(reqItem.LotName)
@@ -142,13 +153,19 @@
 			TotalPrice: 0,
 		}

-		if localComp.CurrentPrice != nil && *localComp.CurrentPrice > 0 {
-			item.UnitPrice = *localComp.CurrentPrice
-			item.TotalPrice = *localComp.CurrentPrice * float64(reqItem.Quantity)
-			item.HasPrice = true
-			total += item.TotalPrice
+		// Get price from pricelist_items
+		if pricelistID != nil {
+			price, found := s.lookupPriceByPricelistID(*pricelistID, reqItem.LotName)
+			if found && price > 0 {
+				item.UnitPrice = price
+				item.TotalPrice = price * float64(reqItem.Quantity)
+				item.HasPrice = true
+				total += item.TotalPrice
+			} else {
+				result.Warnings = append(result.Warnings, "No price available for: "+reqItem.LotName)
+			}
 		} else {
-			result.Warnings = append(result.Warnings, "No price available for: "+reqItem.LotName)
+			result.Warnings = append(result.Warnings, "No pricelist available for: "+reqItem.LotName)
 		}

 		result.Items = append(result.Items, item)

@@ -189,33 +189,54 @@ func listActiveClientMigrations(db *gorm.DB) ([]clientLocalMigration, error) {
 }

 func ensureClientMigrationRegistryTable(db *gorm.DB) error {
-	if err := db.Exec(`
-		CREATE TABLE IF NOT EXISTS qt_client_local_migrations (
-			id VARCHAR(128) NOT NULL,
-			name VARCHAR(255) NOT NULL,
-			sql_text LONGTEXT NOT NULL,
-			checksum VARCHAR(128) NOT NULL,
-			min_app_version VARCHAR(64) NULL,
-			order_no INT NOT NULL DEFAULT 0,
-			is_active TINYINT(1) NOT NULL DEFAULT 1,
-			created_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
-			PRIMARY KEY (id),
-			INDEX idx_qt_client_local_migrations_active_order (is_active, order_no, created_at)
-		)
-	`).Error; err != nil {
-		return err
+	// Check if table exists instead of trying to create (avoids permission issues)
+	if !tableExists(db, "qt_client_local_migrations") {
+		if err := db.Exec(`
+			CREATE TABLE IF NOT EXISTS qt_client_local_migrations (
+				id VARCHAR(128) NOT NULL,
+				name VARCHAR(255) NOT NULL,
+				sql_text LONGTEXT NOT NULL,
+				checksum VARCHAR(128) NOT NULL,
+				min_app_version VARCHAR(64) NULL,
+				order_no INT NOT NULL DEFAULT 0,
+				is_active TINYINT(1) NOT NULL DEFAULT 1,
+				created_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
+				PRIMARY KEY (id),
+				INDEX idx_qt_client_local_migrations_active_order (is_active, order_no, created_at)
+			)
+		`).Error; err != nil {
+			return fmt.Errorf("create qt_client_local_migrations table: %w", err)
+		}
 	}

-	return db.Exec(`
-		CREATE TABLE IF NOT EXISTS qt_client_schema_state (
-			username VARCHAR(100) NOT NULL,
-			last_applied_migration_id VARCHAR(128) NULL,
-			app_version VARCHAR(64) NULL,
-			last_checked_at DATETIME NOT NULL,
-			updated_at DATETIME NOT NULL,
-			PRIMARY KEY (username),
-			INDEX idx_qt_client_schema_state_checked (last_checked_at)
-		)
-	`).Error
+	if !tableExists(db, "qt_client_schema_state") {
+		if err := db.Exec(`
+			CREATE TABLE IF NOT EXISTS qt_client_schema_state (
+				username VARCHAR(100) NOT NULL,
+				last_applied_migration_id VARCHAR(128) NULL,
+				app_version VARCHAR(64) NULL,
+				last_checked_at DATETIME NOT NULL,
+				updated_at DATETIME NOT NULL,
+				PRIMARY KEY (username),
+				INDEX idx_qt_client_schema_state_checked (last_checked_at)
+			)
+		`).Error; err != nil {
+			return fmt.Errorf("create qt_client_schema_state table: %w", err)
+		}
+	}
+	return nil
+}
+
+func tableExists(db *gorm.DB, tableName string) bool {
+	var count int64
+	// For MariaDB/MySQL, check information_schema
+	if err := db.Raw(`
+		SELECT COUNT(*) FROM information_schema.TABLES
+		WHERE TABLE_SCHEMA = DATABASE() AND TABLE_NAME = ?
+	`, tableName).Scan(&count).Error; err != nil {
+		return false
+	}
+	return count > 0
 }

 func (s *Service) applyMissingRemoteMigrations(migrations []clientLocalMigration) error {

@@ -346,17 +346,10 @@ func (s *Service) SyncPricelists() (int, error) {
 	}

 	synced := 0
-	var latestEstimateLocalID uint
-	var latestEstimateCreatedAt time.Time
 	for _, pl := range serverPricelists {
 		// Check if pricelist already exists locally
 		existing, _ := s.localDB.GetLocalPricelistByServerID(pl.ID)
 		if existing != nil {
-			// Track latest estimate pricelist by created_at for component refresh.
-			if pl.Source == string(models.PricelistSourceEstimate) && (latestEstimateCreatedAt.IsZero() || pl.CreatedAt.After(latestEstimateCreatedAt)) {
-				latestEstimateCreatedAt = pl.CreatedAt
-				latestEstimateLocalID = existing.ID
-			}
 			continue
 		}
@@ -385,10 +378,6 @@
 			slog.Debug("synced pricelist with items", "version", pl.Version, "items", itemCount)
 		}

-		if pl.Source == string(models.PricelistSourceEstimate) && (latestEstimateCreatedAt.IsZero() || pl.CreatedAt.After(latestEstimateCreatedAt)) {
-			latestEstimateCreatedAt = pl.CreatedAt
-			latestEstimateLocalID = localPL.ID
-		}
 		synced++
 	}
@@ -399,16 +388,6 @@
 		slog.Info("deleted stale local pricelists", "deleted", removed)
 	}

-	// Update component prices from latest estimate pricelist only.
-	if latestEstimateLocalID > 0 {
-		updated, err := s.localDB.UpdateComponentPricesFromPricelist(latestEstimateLocalID)
-		if err != nil {
-			slog.Warn("failed to update component prices from pricelist", "error", err)
-		} else {
-			slog.Info("updated component prices from latest pricelist", "updated", updated)
-		}
-	}
-
 	// Update last sync time
 	s.localDB.SetLastSyncTime(time.Now())
 	s.RecordSyncHeartbeat()
@@ -553,24 +532,34 @@ func (s *Service) listConnectedDBUsers(mariaDB *gorm.DB) (map[string]struct{}, e
 }

 func ensureUserSyncStatusTable(db *gorm.DB) error {
-	if err := db.Exec(`
-		CREATE TABLE IF NOT EXISTS qt_pricelist_sync_status (
-			username VARCHAR(100) NOT NULL,
-			last_sync_at DATETIME NOT NULL,
-			updated_at DATETIME NOT NULL,
-			app_version VARCHAR(64) NULL,
-			PRIMARY KEY (username),
-			INDEX idx_qt_pricelist_sync_status_last_sync (last_sync_at)
-		)
-	`).Error; err != nil {
-		return err
+	// Check if table exists instead of trying to create (avoids permission issues)
+	if !tableExists(db, "qt_pricelist_sync_status") {
+		if err := db.Exec(`
+			CREATE TABLE IF NOT EXISTS qt_pricelist_sync_status (
+				username VARCHAR(100) NOT NULL,
+				last_sync_at DATETIME NOT NULL,
+				updated_at DATETIME NOT NULL,
+				app_version VARCHAR(64) NULL,
+				PRIMARY KEY (username),
+				INDEX idx_qt_pricelist_sync_status_last_sync (last_sync_at)
+			)
+		`).Error; err != nil {
+			return fmt.Errorf("create qt_pricelist_sync_status table: %w", err)
+		}
 	}

 	// Backward compatibility for environments where table was created without app_version.
-	return db.Exec(`
-		ALTER TABLE qt_pricelist_sync_status
-		ADD COLUMN IF NOT EXISTS app_version VARCHAR(64) NULL
-	`).Error
+	// Only try to add column if table exists.
+	if tableExists(db, "qt_pricelist_sync_status") {
+		if err := db.Exec(`
+			ALTER TABLE qt_pricelist_sync_status
+			ADD COLUMN IF NOT EXISTS app_version VARCHAR(64) NULL
+		`).Error; err != nil {
+			// Log but don't fail if alter fails (column might already exist)
+			slog.Debug("failed to add app_version column", "error", err)
+		}
+	}
+	return nil
 }

 // SyncPricelistItems synchronizes items for a specific pricelist

qfs (binary file not shown)

releases/memory/v1.2.1.md (new file, 72 lines)

@@ -0,0 +1,72 @@
# v1.2.1 Release Notes
**Date:** 2026-02-09
**Changes since v1.2.0:** 2 commits
## Summary
Fixed configurator component substitution by updating to work with new pricelist-based pricing model. Addresses regression from v1.2.0 refactor that removed `CurrentPrice` field from components.
## Commits
### 1. Refactor: Remove CurrentPrice from local_components (5984a57)
**Type:** Refactor
**Files Changed:** 11 files, +167 insertions, -194 deletions
#### Overview
Transitioned from component-based pricing to pricelist-based pricing model:
- Removed `CurrentPrice` and `SyncedAt` from LocalComponent (metadata-only now)
- Added `WarehousePricelistID` and `CompetitorPricelistID` to LocalConfiguration
- Removed 2 unused methods: UpdateComponentPricesFromPricelist, EnsureComponentPricesFromPricelists
#### Key Changes
- **Data Model:**
- LocalComponent: now stores only metadata (LotName, LotDescription, Category, Model)
- LocalConfiguration: added warehouse and competitor pricelist references
- **Migrations:**
- drop_component_unused_fields - removes CurrentPrice, SyncedAt columns
- add_warehouse_competitor_pricelists - adds new pricelist fields
- **Quote Calculation:**
- Updated to use pricelist_items instead of component.CurrentPrice
- Added PricelistID field to QuoteRequest
- Maintains offline-first behavior
- **API:**
- Removed CurrentPrice from ComponentView
- Components API no longer returns pricing
### 2. Fix: Load component prices via API (acf7c8a)
**Type:** Bug Fix
**Files Changed:** 1 file (web/templates/index.html), +66 insertions, -12 deletions
#### Problem
After v1.2.0 refactor, the configurator's autocomplete was filtering out all components because it checked for the removed `current_price` field on component objects.
#### Solution
Implemented on-demand price loading via API:
- Added `ensurePricesLoaded()` function to fetch prices from `/api/quote/price-levels`
- Added `componentPricesCache` to cache loaded prices in memory
- Updated all 3 autocomplete modes (single, multi, section) to load prices when input is focused
- Changed price validation from `c.current_price` to `hasComponentPrice(lot_name)`
- Updated cart item creation to use cached API prices
#### Impact
- Components without prices are still filtered out (as required)
- Price checks now use API data instead of removed database field
- Frontend loads prices on-demand for better performance
## Testing Notes
- ✅ Configurator component substitution now works
- ✅ Prices load correctly from pricelist
- ✅ Offline mode still supported (prices cached after initial load)
- ✅ Multi-pricelist support functional (estimate/warehouse/competitor)
## Known Issues
None
## Migration Path
No database migration needed from v1.2.0 - migrations were applied in v1.2.0 release.
## Breaking Changes
None for end users. Internal: `ComponentView` no longer includes `CurrentPrice` in API responses.

releases/memory/v1.2.2.md (new file, 59 lines)

@@ -0,0 +1,59 @@
# Release v1.2.2 (2026-02-09)
## Summary
Fixed CSV export filename inconsistency where project names weren't being resolved correctly. Standardized export format across both manual exports and project configuration exports to use `YYYY-MM-DD (project_name) config_name BOM.csv`.
## Commits
- `8f596ce` fix: standardize CSV export filename format to use project name
## Changes
### CSV Export Filename Standardization
**Problem:**
- ExportCSV and ExportConfigCSV had inconsistent filename formats
- Project names sometimes fell back to config names when not explicitly provided
- Export timestamps didn't reflect actual price update time
**Solution:**
- Unified format: `YYYY-MM-DD (project_name) config_name BOM.csv`
- Both export paths now use PriceUpdatedAt if available, otherwise CreatedAt
- Project name resolved from ProjectUUID via ProjectService for both paths
- Frontend passes project_uuid context when exporting
**Technical Details:**
Backend:
- Added `ProjectUUID` field to `ExportRequest` struct in handlers/export.go
- Updated ExportCSV to look up project name from ProjectUUID using ProjectService
- Ensured ExportConfigCSV gets project name from config's ProjectUUID
- ExportCSV uses CreatedAt; ExportConfigCSV uses PriceUpdatedAt when available, falling back to CreatedAt
Frontend:
- Added `projectUUID` and `projectName` state variables in index.html
- Load and store projectUUID when configuration is loaded
- Pass `project_uuid` in JSON body for both export requests
## Files Modified
- `internal/handlers/export.go` - Project name resolution and ExportRequest update
- `internal/handlers/export_test.go` - Updated mock initialization with projectService param
- `cmd/qfs/main.go` - Pass projectService to ExportHandler constructor
- `web/templates/index.html` - Add projectUUID tracking and export payload updates
## Testing Notes
✅ All existing tests updated and passing
✅ Code builds without errors
✅ Export filename now includes correct project name
✅ Works for both form-based and project-based exports
## Breaking Changes
None - API response format unchanged, only filename generation updated.
## Known Issues
None identified.

@@ -0,0 +1,89 @@
# QuoteForge v1.2.1
**Release date:** 2026-02-09
**Tag:** `v1.2.1`
**GitHub:** https://git.mchus.pro/mchus/QuoteForge/releases/tag/v1.2.1
## Summary
A quick patch release fixing a configurator regression introduced by the v1.2.0 refactor. After the `CurrentPrice` field was removed from components, the autocomplete stopped showing any components. Prices are now loaded on demand via the API.
## What's fixed
### 🐛 Configurator Component Substitution (acf7c8a)
- **Problem:** after the v1.2.0 refactor, the autocomplete filtered out ALL components because it checked the removed `current_price` field
- **Solution:** on-demand price loading via `/api/quote/price-levels`
- Added `componentPricesCache` to cache prices in memory
- `ensurePricesLoaded()` loads prices when the search field gains focus
- All 3 autocomplete modes (single, multi, section) updated
- Components without prices are still filtered out (as required), but the check now uses API data
- **Affected files:** `web/templates/index.html` (+66 lines, -12 lines)
## History v1.2.0 → v1.2.1
Total commits: **2**
| Hash | Author | Message |
|------|--------|---------|
| `acf7c8a` | Claude | fix: load component prices via API instead of removed current_price field |
| `5984a57` | Claude | refactor: remove CurrentPrice from local_components and transition to pricelist-based pricing |
## Testing
✅ Configurator component substitution works
✅ Prices load correctly from the pricelist
✅ Offline mode is still supported (prices are cached after the first load)
✅ Multi-pricelist support is functional (estimate/warehouse/competitor)
## Breaking Changes
No breaking changes for end users.
⚠️ **For developers:** the `ComponentView` API no longer returns `CurrentPrice`.
## Migration
No database migration is required — all migrations were applied in v1.2.0.
## Installation
### macOS
```bash
# Download and extract
tar xzf qfs-v1.2.1-darwin-arm64.tar.gz  # for Apple Silicon
# or
tar xzf qfs-v1.2.1-darwin-amd64.tar.gz  # for Intel Macs
# Remove the Gatekeeper quarantine attribute (if needed)
xattr -d com.apple.quarantine ./qfs
# Run
./qfs
```
### Linux
```bash
tar xzf qfs-v1.2.1-linux-amd64.tar.gz
./qfs
```
### Windows
```bash
# Extract qfs-v1.2.1-windows-amd64.zip
# Run qfs.exe
```
## Known Issues
No known issues at the time of release.
## Support
For questions, contact [@mchus](https://git.mchus.pro/mchus)
---
*Sent with ❤️ via Claude Code*

scripts/check-secrets.sh (new executable file, 56 lines)

@@ -0,0 +1,56 @@
#!/usr/bin/env bash
set -euo pipefail

if ! git rev-parse --git-dir >/dev/null 2>&1; then
    echo "Not inside a git repository."
    exit 1
fi

if ! command -v rg >/dev/null 2>&1; then
    echo "ripgrep (rg) is required for secret scanning."
    exit 1
fi

staged_files=()
while IFS= read -r file; do
    staged_files+=("$file")
done < <(git diff --cached --name-only --diff-filter=ACMRTUXB)

if [ "${#staged_files[@]}" -eq 0 ]; then
    exit 0
fi

secret_pattern='AKIA[0-9A-Z]{16}|ASIA[0-9A-Z]{16}|ghp_[A-Za-z0-9]{36}|github_pat_[A-Za-z0-9_]{20,}|xox[baprs]-[A-Za-z0-9-]{10,}|AIza[0-9A-Za-z_-]{35}|-----BEGIN (RSA|OPENSSH|EC|DSA|PRIVATE) KEY-----|(?i)(password|passwd|pwd|secret|token|api[_-]?key|jwt_secret)\s*[:=]\s*["'"'"'][^"'"'"'\s]{8,}["'"'"']'
allow_pattern='CHANGE_ME|REDACTED|PLACEHOLDER|EXAMPLE|example|<[^>]+>'
found=0

for file in "${staged_files[@]}"; do
    case "$file" in
        dist/*|*.png|*.jpg|*.jpeg|*.gif|*.webp|*.pdf|*.zip|*.gz|*.exe|*.dll|*.so|*.dylib)
            continue
            ;;
    esac
    if ! content="$(git show ":$file" 2>/dev/null)"; then
        continue
    fi
    hits="$(printf '%s' "$content" | rg -n --no-heading -e "$secret_pattern" || true)"
    if [ -n "$hits" ]; then
        filtered="$(printf '%s\n' "$hits" | rg -v -e "$allow_pattern" || true)"
        if [ -n "$filtered" ]; then
            echo "Potential secret found in staged file: $file"
            printf '%s\n' "$filtered"
            found=1
        fi
    fi
done

if [ "$found" -ne 0 ]; then
    echo
    echo "Commit blocked: remove or redact secrets before committing."
    exit 1
fi
exit 0
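The script reads like a pre-commit hook: it exits non-zero when a staged file matches the secret pattern, which blocks the commit. One way to wire it up (assuming the `scripts/check-secrets.sh` path shown above; the symlink layout is an assumption) might be:

```shell
# Hypothetical hook installation; run from the repository root.
# Assumes the scanner lives at scripts/check-secrets.sh as shown above.
chmod +x scripts/check-secrets.sh
ln -sf ../../scripts/check-secrets.sh .git/hooks/pre-commit
```

The relative symlink target is resolved from `.git/hooks/`, so `../../` points back at the repository root.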


@@ -285,6 +285,14 @@
         showToast(successMessage, 'success');
         // Update last sync time - removed since dropdown is gone
         // loadLastSyncTime();
+        // Dispatch custom event for pages to react to sync completion
+        window.dispatchEvent(new CustomEvent('sync-completed', {
+            detail: {
+                endpoint: endpoint,
+                data: data
+            }
+        }));
     } else if (resp.status === 423) {
         const reason = data.reason_text || data.error || 'Синхронизация заблокирована.';
         showToast(reason, 'error');


@@ -4,13 +4,10 @@
 <div class="space-y-4">
     <h1 class="text-2xl font-bold">Мои конфигурации</h1>
-    <div id="action-buttons" class="mt-4 grid grid-cols-1 sm:grid-cols-2 gap-3">
-        <button onclick="openCreateModal()" class="py-3 bg-blue-600 text-white rounded-lg hover:bg-blue-700 font-medium">
+    <div id="action-buttons" class="mt-4">
+        <button onclick="openCreateModal()" class="w-full sm:w-auto py-3 px-6 bg-blue-600 text-white rounded-lg hover:bg-blue-700 font-medium">
             + Создать новую конфигурацию
         </button>
-        <button id="import-configs-btn" onclick="importConfigsFromServer()" class="py-3 bg-emerald-600 text-white rounded-lg hover:bg-emerald-700 font-medium">
-            Импорт с сервера
-        </button>
     </div>
     <div class="mt-4 inline-flex rounded-lg border border-gray-200 overflow-hidden">
@@ -57,12 +54,12 @@
 <div class="space-y-4">
     <div>
-        <label class="block text-sm font-medium text-gray-700 mb-1">Номер Opportunity</label>
-        <input type="text" id="opportunity-number" placeholder="Например: OPP-2024-001"
+        <label class="block text-sm font-medium text-gray-700 mb-1">Название конфигурации</label>
+        <input type="text" id="opportunity-number" placeholder="Например: Сервер для проекта X"
             class="w-full px-3 py-2 border rounded focus:ring-2 focus:ring-blue-500 focus:border-blue-500">
     </div>
     <div>
-        <label class="block text-sm font-medium text-gray-700 mb-1">Проект</label>
+        <label class="block text-sm font-medium text-gray-700 mb-1">Код проекта</label>
         <input id="create-project-input"
             list="create-project-options"
             placeholder="Начните вводить название проекта"
@@ -785,44 +782,19 @@ async function loadConfigs() {
     }
 }

-async function importConfigsFromServer() {
-    const button = document.getElementById('import-configs-btn');
-    const originalText = button.textContent;
-    button.disabled = true;
-    button.textContent = 'Импорт...';
-    try {
-        const resp = await fetch('/api/configs/import', { method: 'POST' });
-        const data = await resp.json();
-        if (!resp.ok) {
-            alert('Ошибка импорта: ' + (data.error || 'неизвестная ошибка'));
-            return;
-        }
-        alert(
-            'Импорт завершен:\n' +
-            '- Новых: ' + (data.imported || 0) + '\n' +
-            '- Обновлено: ' + (data.updated || 0) + '\n' +
-            '- Пропущено (локальные изменения): ' + (data.skipped || 0)
-        );
-        currentPage = 1;
-        await loadConfigs();
-    } catch (e) {
-        alert('Ошибка импорта с сервера');
-    } finally {
-        button.disabled = false;
-        button.textContent = originalText;
-    }
-}
-
 document.addEventListener('DOMContentLoaded', function() {
     applyStatusModeUI();
     loadProjectsForConfigUI().then(loadConfigs);
     // Load latest pricelist version for badge
     loadLatestPricelistVersion();
+
+    // Listen for sync completion events from navbar
+    window.addEventListener('sync-completed', function(e) {
+        // Reset pagination and reload configurations list
+        currentPage = 1;
+        loadConfigs();
+    });
 });

 document.getElementById('configs-search').addEventListener('input', function(e) {
@@ -835,12 +807,17 @@ async function loadProjectsForConfigUI() {
     projectsCache = [];
     projectNameByUUID = {};
     try {
-        const resp = await fetch('/api/projects?status=all');
+        // Use /api/projects/all to get all projects without pagination
+        const resp = await fetch('/api/projects/all');
         if (!resp.ok) return;
         const data = await resp.json();
-        projectsCache = (data.projects || []);
-        projectsCache.forEach(project => {
+        // data is now a simple array of {uuid, name} objects
+        const allProjects = Array.isArray(data) ? data : (data.projects || []);
+        // For compatibility with rest of code, populate projectsCache but mainly use projectNameByUUID
+        projectsCache = allProjects;
+        allProjects.forEach(project => {
             projectNameByUUID[project.uuid] = project.name;
         });


@@ -326,6 +326,8 @@ let ASSIGNED_CATEGORIES = Object.values(TAB_CONFIG)
 // State
 let configUUID = '{{.ConfigUUID}}';
 let configName = '';
+let projectUUID = '';
+let projectName = '';
 let currentTab = 'base';
 let allComponents = [];
 let cart = [];
@@ -351,6 +353,8 @@ let priceLevelsRefreshTimer = null;
 let warehouseStockLotsByPricelist = new Map();
 let warehouseStockLoadSeq = 0;
 let warehouseStockLoadsByPricelist = new Map();
+let componentPricesCache = {}; // { lot_name: price } - caches prices loaded via API
+let componentPricesCacheLoading = new Map(); // { category: Promise } - tracks ongoing price loads

 // Autocomplete state
 let autocompleteInput = null;
@@ -607,6 +611,7 @@ document.addEventListener('DOMContentLoaded', async function() {
     const config = await resp.json();
     configName = config.name;
+    projectUUID = config.project_uuid || '';

     document.getElementById('config-name').textContent = config.name;
     document.getElementById('save-buttons').classList.remove('hidden');
@@ -1201,12 +1206,54 @@ function renderMultiSelectTabWithSections(sections) {
     document.getElementById('tab-content').innerHTML = html;
 }

+// Load prices for components in a category/tab via API
+async function ensurePricesLoaded(components) {
+    if (!components || components.length === 0) return;
+
+    // Filter out components that already have prices cached
+    const toLoad = components.filter(c => !(c.lot_name in componentPricesCache));
+    if (toLoad.length === 0) return;
+
+    try {
+        // Use quote/price-levels API to get prices for these components
+        const resp = await fetch('/api/quote/price-levels', {
+            method: 'POST',
+            headers: { 'Content-Type': 'application/json' },
+            body: JSON.stringify({
+                items: toLoad.map(c => ({ lot_name: c.lot_name, quantity: 1 })),
+                pricelist_ids: Object.fromEntries(
+                    Object.entries(selectedPricelistIds)
+                        .filter(([, id]) => typeof id === 'number' && id > 0)
+                )
+            })
+        });
+        if (resp.ok) {
+            const data = await resp.json();
+            if (data.items) {
+                data.items.forEach(item => {
+                    // Cache the estimate price (or 0 if not found)
+                    componentPricesCache[item.lot_name] = item.estimate_price || 0;
+                });
+            }
+        }
+    } catch (e) {
+        console.error('Failed to load component prices', e);
+    }
+}
+
+function hasComponentPrice(lotName) {
+    return lotName in componentPricesCache && componentPricesCache[lotName] > 0;
+}
+
 // Autocomplete for single select (Base tab)
-function showAutocomplete(category, input) {
+async function showAutocomplete(category, input) {
     autocompleteInput = input;
     autocompleteCategory = category;
     autocompleteMode = 'single';
     autocompleteIndex = -1;
+    const components = getComponentsForCategory(category);
+    await ensurePricesLoaded(components);
     filterAutocomplete(category, input.value);
 }
@@ -1215,7 +1262,7 @@ function filterAutocomplete(category, search) {
     const searchLower = search.toLowerCase();
     autocompleteFiltered = components.filter(c => {
-        if (!c.current_price) return false;
+        if (!hasComponentPrice(c.lot_name)) return false;
         if (!isComponentAllowedByStockFilter(c)) return false;
         const text = (c.lot_name + ' ' + (c.description || '')).toLowerCase();
         return text.includes(searchLower);
@@ -1298,12 +1345,13 @@ function selectAutocompleteItem(index) {
     const qtyInput = document.getElementById('qty-' + autocompleteCategory);
     const qty = parseInt(qtyInput?.value) || 1;
+    const price = componentPricesCache[comp.lot_name] || 0;
     cart.push({
         lot_name: comp.lot_name,
         quantity: qty,
-        unit_price: comp.current_price,
-        estimate_price: comp.current_price,
+        unit_price: price,
+        estimate_price: price,
         warehouse_price: null,
         competitor_price: null,
         delta_wh_estimate_abs: null,
@@ -1333,11 +1381,13 @@ function hideAutocomplete() {
 }

 // Autocomplete for multi select tabs
-function showAutocompleteMulti(input) {
+async function showAutocompleteMulti(input) {
     autocompleteInput = input;
     autocompleteCategory = null;
     autocompleteMode = 'multi';
     autocompleteIndex = -1;
+    const components = getComponentsForTab(currentTab);
+    await ensurePricesLoaded(components);
     filterAutocompleteMulti(input.value);
 }
@@ -1349,7 +1399,7 @@ function filterAutocompleteMulti(search) {
     const addedLots = new Set(cart.map(i => i.lot_name));
     autocompleteFiltered = components.filter(c => {
-        if (!c.current_price) return false;
+        if (!hasComponentPrice(c.lot_name)) return false;
         if (addedLots.has(c.lot_name)) return false;
         if (!isComponentAllowedByStockFilter(c)) return false;
         const text = (c.lot_name + ' ' + (c.description || '')).toLowerCase();
@@ -1390,12 +1440,13 @@ function selectAutocompleteItemMulti(index) {
     const qtyInput = document.getElementById('new-qty');
     const qty = parseInt(qtyInput?.value) || 1;
+    const price = componentPricesCache[comp.lot_name] || 0;
     cart.push({
         lot_name: comp.lot_name,
         quantity: qty,
-        unit_price: comp.current_price,
-        estimate_price: comp.current_price,
+        unit_price: price,
+        estimate_price: price,
         warehouse_price: null,
         competitor_price: null,
         delta_wh_estimate_abs: null,
@@ -1417,11 +1468,16 @@ function selectAutocompleteItemMulti(index) {
 }

 // Autocomplete for sectioned tabs (like storage with RAID and Disks sections)
-function showAutocompleteSection(sectionId, input) {
+async function showAutocompleteSection(sectionId, input) {
     autocompleteInput = input;
     autocompleteCategory = sectionId; // Store section ID
     autocompleteMode = 'section';
     autocompleteIndex = -1;
+
+    // Load prices for tab components
+    const components = getComponentsForTab(currentTab);
+    await ensurePricesLoaded(components);
+
     filterAutocompleteSection(sectionId, input.value, input);
 }
@@ -1448,7 +1504,7 @@ function filterAutocompleteSection(sectionId, search, inputElement) {
     const addedLots = new Set(cart.map(i => i.lot_name));
     autocompleteFiltered = sectionComponents.filter(c => {
-        if (!c.current_price) return false;
+        if (!hasComponentPrice(c.lot_name)) return false;
         if (addedLots.has(c.lot_name)) return false;
         if (!isComponentAllowedByStockFilter(c)) return false;
         const text = (c.lot_name + ' ' + (c.description || '')).toLowerCase();
@@ -1489,12 +1545,13 @@ function selectAutocompleteItemSection(index, sectionId) {
     const qtyInput = document.getElementById('new-qty-' + sectionId);
     const qty = parseInt(qtyInput?.value) || 1;
+    const price = componentPricesCache[comp.lot_name] || 0;
     cart.push({
         lot_name: comp.lot_name,
         quantity: qty,
-        unit_price: comp.current_price,
-        estimate_price: comp.current_price,
+        unit_price: price,
+        estimate_price: price,
         warehouse_price: null,
         competitor_price: null,
         delta_wh_estimate_abs: null,
@@ -1716,6 +1773,14 @@ async function saveConfig(showNotification = true) {
     }
 }

+// Helper function to extract filename from Content-Disposition header
+function getFilenameFromResponse(resp) {
+    const contentDisposition = resp.headers.get('content-disposition');
+    if (!contentDisposition) return null;
+    const matches = contentDisposition.match(/filename="?([^"]+)"?/);
+    return matches && matches[1] ? matches[1] : null;
+}
+
 async function exportCSV() {
     if (cart.length === 0) return;
@@ -1733,14 +1798,14 @@ async function exportCSV() {
         const resp = await fetch('/api/export/csv', {
             method: 'POST',
             headers: {'Content-Type': 'application/json'},
-            body: JSON.stringify({items: exportItems, name: configName})
+            body: JSON.stringify({items: exportItems, name: configName, project_uuid: projectUUID})
         });
         const blob = await resp.blob();
         const url = window.URL.createObjectURL(blob);
         const a = document.createElement('a');
         a.href = url;
-        a.download = (configName || 'config') + '.csv';
+        a.download = getFilenameFromResponse(resp) || (configName || 'config') + '.csv';
         a.click();
         window.URL.revokeObjectURL(url);
     } catch(e) {
@@ -1986,14 +2051,14 @@ async function exportCSVWithCustomPrice() {
         const resp = await fetch('/api/export/csv', {
             method: 'POST',
             headers: {'Content-Type': 'application/json'},
-            body: JSON.stringify({items: adjustedCart, name: configName})
+            body: JSON.stringify({items: adjustedCart, name: configName, project_uuid: projectUUID})
         });
         const blob = await resp.blob();
         const url = window.URL.createObjectURL(blob);
         const a = document.createElement('a');
         a.href = url;
-        a.download = (configName || 'config') + '.csv';
+        a.download = getFilenameFromResponse(resp) || (configName || 'config') + '.csv';
         a.click();
         window.URL.revokeObjectURL(url);
     } catch(e) {


@@ -235,6 +235,12 @@
 document.addEventListener('DOMContentLoaded', function() {
     checkPricelistWritePermission();
     loadPricelists(1);
+
+    // Listen for sync completion events from navbar
+    window.addEventListener('sync-completed', function(e) {
+        // Reload pricelists on sync completion
+        loadPricelists(1);
+    });
 });
 </script>
 {{end}}


@@ -385,40 +385,48 @@ async function copyProject(projectUUID, projectName) {
     loadProjects();
 }

-loadProjects();
-
-document.getElementById('projects-search').addEventListener('input', function(e) {
-    projectsSearch = (e.target.value || '').trim();
-    currentPage = 1;
-    loadProjects();
-});
-
-document.getElementById('create-project-code').addEventListener('input', function() {
-    updateCreateProjectTrackerURL();
-});
-
-document.getElementById('create-project-tracker-url').addEventListener('input', function(e) {
-    createProjectTrackerManuallyEdited = (e.target.value || '').trim() !== createProjectLastAutoTrackerURL;
-});
-
-document.getElementById('create-project-code').addEventListener('keydown', function(e) {
-    if (e.key === 'Enter') {
-        e.preventDefault();
-        createProject();
-    }
-});
-
-document.getElementById('create-project-tracker-url').addEventListener('keydown', function(e) {
-    if (e.key === 'Enter') {
-        e.preventDefault();
-        createProject();
-    }
-});
-
-document.getElementById('create-project-modal').addEventListener('click', function(e) {
-    if (e.target === this) {
-        closeCreateProjectModal();
-    }
-});
+document.addEventListener('DOMContentLoaded', function() {
+    loadProjects();
+
+    document.getElementById('projects-search').addEventListener('input', function(e) {
+        projectsSearch = (e.target.value || '').trim();
+        currentPage = 1;
+        loadProjects();
+    });
+
+    document.getElementById('create-project-code').addEventListener('input', function() {
+        updateCreateProjectTrackerURL();
+    });
+
+    document.getElementById('create-project-tracker-url').addEventListener('input', function(e) {
+        createProjectTrackerManuallyEdited = (e.target.value || '').trim() !== createProjectLastAutoTrackerURL;
+    });
+
+    document.getElementById('create-project-code').addEventListener('keydown', function(e) {
+        if (e.key === 'Enter') {
+            e.preventDefault();
+            createProject();
+        }
+    });
+
+    document.getElementById('create-project-tracker-url').addEventListener('keydown', function(e) {
+        if (e.key === 'Enter') {
+            e.preventDefault();
+            createProject();
+        }
+    });
+
+    document.getElementById('create-project-modal').addEventListener('click', function(e) {
+        if (e.target === this) {
+            closeCreateProjectModal();
+        }
+    });
+
+    // Listen for sync completion events from navbar
+    window.addEventListener('sync-completed', function(e) {
+        // Reset pagination and reload projects list
+        loadProjects();
+    });
+});
 </script>
 {{end}}