Add LOGPile BMC diagnostic log analyzer
Features:
- Modular parser architecture for vendor-specific formats
- Inspur/Kaytus parser supporting asset.json, devicefrusdr.log, component.log, idl.log, and syslog files
- PCI Vendor/Device ID lookup for hardware identification
- Web interface with tabs: Events, Sensors, Config, Serials, Firmware
- Server specification summary with component grouping
- Export to CSV, JSON, TXT formats
- BMC alarm parsing from IDL logs (memory errors, PSU events, etc.)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
.gitignore (vendored): 6 lines changed
@@ -46,6 +46,12 @@ Temporary Items
# Dependency directories (remove the comment below to include it)
# vendor/

# Build output
bin/

# Example data
example/

# Go workspace file
go.work
go.work.sum
CLAUDE.md (new file): 193 lines
@@ -0,0 +1,193 @@
# BMC Analyzer - Instructions for Claude Code

## Project Description

An application for analyzing diagnostic information from server BMCs (IPMI).
It is a standalone Go binary with an embedded web interface.

### Functionality

**Input:**
- An archive (tar.gz/zip) with diagnostic data from an IPMI server

**Processing:**
- Parsing the System Event Log (SEL), the IPMI event journal
- Parsing FRU (Field Replaceable Unit) data: component serial numbers
- Parsing the server configuration (CPU, RAM, disks, etc.)

**Output:**
- A web interface presenting the data in human-readable form
- Log export to TXT/JSON
- Configuration export to JSON
- Serial number export to CSV

## Architecture

- **Type:** standalone binary with an embedded web server
- **Language:** Go
- **UI:** embedded HTML + CSS + vanilla JS (or Alpine.js)
- **Port:** localhost:8080 (default)

## Project Structure

```
bmc-analyzer/
├── cmd/bmc-analyzer/main.go  # Entry point
├── internal/
│   ├── parser/               # Archive and IPMI data parsing
│   ├── models/               # Data models
│   ├── analyzer/             # Analysis logic
│   ├── exporter/             # Data export
│   └── server/               # HTTP server and handlers
├── web/                      # Embedded web interface
│   ├── static/               # CSS, JS, images
│   └── templates/            # HTML templates
├── testdata/                 # Sample archives for tests
├── go.mod
├── Makefile
└── README.md
```

## Tech Stack

### Backend
- Go 1.21+
- Standard library (net/http, archive/tar, compress/gzip)
- embed for bundling web assets
- Optionally: fiber or gin for routing (at your discretion)

### Frontend
- Vanilla JavaScript or Alpine.js (keep it minimal)
- CSS (Tailwind CSS via CDN is acceptable)
- No bundlers: everything is embedded in the binary

### IPMI Parsing
- SEL format: usually the text output of `ipmitool sel list`, sometimes binary
- FRU format: the output of `ipmitool fru print`
- Configuration: assorted text files from the archive

## Development Stages

### 1. Basic structure ✓
- [x] Directory structure created
- [ ] go.mod initialized
- [ ] Makefile created

### 2. Archive parser
- [ ] tar.gz extraction
- [ ] zip extraction
- [ ] File type detection inside the archive

### 3. IPMI data parsers
- [ ] SEL parser (System Event Log)
- [ ] FRU parser (serial numbers)
- [ ] Config parser (server configuration)

### 4. Data models
- [ ] Event (SEL events)
- [ ] Hardware (configuration)
- [ ] SerialNumber (component serials)

### 5. Web server
- [ ] HTTP server with embedded files
- [ ] Upload handler for archives
- [ ] API endpoints for retrieving data
- [ ] Export handlers

### 6. Web interface
- [ ] Main page with an upload form
- [ ] Event display (timeline/table)
- [ ] Configuration display
- [ ] Serial number table
- [ ] Export buttons

### 7. Exporters
- [ ] CSV export (serials)
- [ ] JSON export (config, events)
- [ ] TXT report (logs)

### 8. Testing and builds
- [ ] Unit tests for the parsers
- [ ] Integration tests
- [ ] Cross-platform builds (Linux, Windows, Mac)

## Usage Examples

```bash
# Simple run
./bmc-analyzer

# With a custom port
./bmc-analyzer --port 9000

# With a pre-loaded file
./bmc-analyzer --file /path/to/bmc-archive.tar.gz

# Cross-compilation
make build-all
```

## IPMI Data Formats

### SEL (System Event Log)
```
SEL Record ID   : 0001
Record Type     : 02
Timestamp       : 01/15/2025 14:23:45
Generator ID    : 0020
EvM Revision    : 04
Sensor Type     : Temperature
Sensor Number   : 01
Event Type      : Threshold
Event Direction : Assertion Event
Event Data      : 010000
Description     : Upper Critical - going high
```
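A record in this text form can be handled with a small `Key : Value` scanner. The sketch below is illustrative only (it is not the project's actual SEL parser) and assumes the field names shown in the sample above:

```go
package main

import (
	"bufio"
	"fmt"
	"strings"
)

// parseSELRecord splits "Key : Value" lines of an ipmitool-style SEL
// record into a map. It cuts at the first colon, so values containing
// colons (e.g. timestamps) survive intact.
func parseSELRecord(record string) map[string]string {
	fields := make(map[string]string)
	sc := bufio.NewScanner(strings.NewReader(record))
	for sc.Scan() {
		key, value, ok := strings.Cut(sc.Text(), ":")
		if !ok {
			continue // skip lines without a separator
		}
		fields[strings.TrimSpace(key)] = strings.TrimSpace(value)
	}
	return fields
}

func main() {
	rec := "SEL Record ID   : 0001\n" +
		"Sensor Type     : Temperature\n" +
		"Description     : Upper Critical - going high"
	f := parseSELRecord(rec)
	fmt.Println(f["Sensor Type"], "/", f["Description"])
}
```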

### FRU (Field Replaceable Unit)
```
FRU Device Description : Builtin FRU Device (ID 0)
Board Mfg Date         : Mon Jan  1 00:00:00 1996
Board Mfg              : Supermicro
Board Product          : X11DPH-T
Board Serial           : WM194S001234
Board Part Number      : X11DPH-TQ
```
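FRU output follows the same `Key : Value` shape; the board fields can be mapped straight onto a struct. A sketch under the assumption that the field names match the sample above (the real parser may differ):

```go
package main

import (
	"fmt"
	"strings"
)

// BoardFRU holds the board-level fields of `ipmitool fru print` output.
type BoardFRU struct {
	Manufacturer string
	Product      string
	Serial       string
	PartNumber   string
}

// parseBoardFRU picks the board fields out of a FRU text block.
// Matching on the full trimmed key avoids "Board Mfg" also catching
// the "Board Mfg Date" line.
func parseBoardFRU(text string) BoardFRU {
	var b BoardFRU
	for _, line := range strings.Split(text, "\n") {
		key, value, ok := strings.Cut(line, ":")
		if !ok {
			continue
		}
		value = strings.TrimSpace(value)
		switch strings.TrimSpace(key) {
		case "Board Mfg":
			b.Manufacturer = value
		case "Board Product":
			b.Product = value
		case "Board Serial":
			b.Serial = value
		case "Board Part Number":
			b.PartNumber = value
		}
	}
	return b
}

func main() {
	fru := parseBoardFRU("Board Mfg              : Supermicro\nBoard Serial           : WM194S001234")
	fmt.Println(fru.Manufacturer, fru.Serial)
}
```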

## API Endpoints (planned)

```
POST   /api/upload       # Upload an archive
GET    /api/events       # Get the event list
GET    /api/config       # Get the configuration
GET    /api/serials      # Get the serial numbers
GET    /api/export/csv   # CSV export
GET    /api/export/json  # JSON export
GET    /api/export/txt   # Text report export
DELETE /api/clear        # Clear the loaded data
```

## Next Steps

1. Initialize the Go module
2. Create the basic package structure
3. Implement the archive parser (tar.gz)
4. Build a simple HTTP server with an upload form
5. Implement SEL log parsing
6. Add a web UI for displaying the data

## Notes

- All web interface files must be embedded into the binary via `//go:embed`
- Prioritize simplicity and a minimum of dependencies
- Security: validate uploaded archives (size, file types)
- The UI should be simple and functional, not cluttered
- Russian language support in the interface

## Open Questions

1. Which BMC vendors are actually in use? (Supermicro, Dell iDRAC, HP iLO, etc.)
2. Are there real sample archives for testing?
3. Is support for different SEL formats needed (text vs binary)?
4. Which metrics/events are most important for the analysis?
5. Is event filtering by severity needed (Critical, Warning, Info)?
Makefile (new file): 35 lines
@@ -0,0 +1,35 @@
.PHONY: build run clean test build-all dev fmt lint

BINARY_NAME=logpile
VERSION=$(shell git describe --tags --always --dirty 2>/dev/null || echo "dev")
COMMIT=$(shell git rev-parse --short HEAD 2>/dev/null || echo "none")
LDFLAGS=-ldflags "-X main.version=$(VERSION) -X main.commit=$(COMMIT)"

build:
	go build $(LDFLAGS) -o bin/$(BINARY_NAME) ./cmd/logpile

run: build
	./bin/$(BINARY_NAME)

clean:
	rm -rf bin/

test:
	go test -v ./...

# Cross-platform builds
build-all: clean
	GOOS=linux GOARCH=amd64 go build $(LDFLAGS) -o bin/$(BINARY_NAME)-linux-amd64 ./cmd/logpile
	GOOS=linux GOARCH=arm64 go build $(LDFLAGS) -o bin/$(BINARY_NAME)-linux-arm64 ./cmd/logpile
	GOOS=darwin GOARCH=amd64 go build $(LDFLAGS) -o bin/$(BINARY_NAME)-darwin-amd64 ./cmd/logpile
	GOOS=darwin GOARCH=arm64 go build $(LDFLAGS) -o bin/$(BINARY_NAME)-darwin-arm64 ./cmd/logpile
	GOOS=windows GOARCH=amd64 go build $(LDFLAGS) -o bin/$(BINARY_NAME)-windows-amd64.exe ./cmd/logpile

dev:
	go run ./cmd/logpile

fmt:
	go fmt ./...

lint:
	golangci-lint run
cmd/logpile/main.go (new file): 46 lines
@@ -0,0 +1,46 @@
package main

import (
	"flag"
	"fmt"
	"log"
	"os"

	"git.mchus.pro/mchus/logpile/internal/parser"
	_ "git.mchus.pro/mchus/logpile/internal/parser/vendors" // Register all vendor parsers
	"git.mchus.pro/mchus/logpile/internal/server"
	"git.mchus.pro/mchus/logpile/web"
)

var (
	version = "dev"
	commit  = "none"
)

func main() {
	port := flag.Int("port", 8080, "HTTP server port")
	file := flag.String("file", "", "Pre-load archive file")
	showVersion := flag.Bool("version", false, "Show version")
	flag.Parse()

	if *showVersion {
		fmt.Printf("LOGPile %s (commit: %s)\n", version, commit)
		os.Exit(0)
	}

	// Set embedded web files
	server.WebFS = web.FS

	cfg := server.Config{
		Port:        *port,
		PreloadFile: *file,
	}

	srv := server.New(cfg)

	log.Printf("LOGPile starting on http://localhost:%d", *port)
	log.Printf("Registered parsers: %v", parser.ListParsers())
	if err := srv.Run(); err != nil {
		log.Fatalf("Server error: %v", err)
	}
}
internal/analyzer/analyzer.go (new file): 56 lines
@@ -0,0 +1,56 @@
package analyzer

import "git.mchus.pro/mchus/logpile/internal/models"

// Analyzer processes parsed IPMI data
type Analyzer struct {
	result *models.AnalysisResult
}

// New creates a new analyzer
func New() *Analyzer {
	return &Analyzer{}
}

// SetData sets the data to analyze
func (a *Analyzer) SetData(result *models.AnalysisResult) {
	a.result = result
}

// GetCriticalEvents returns only critical severity events
func (a *Analyzer) GetCriticalEvents() []models.Event {
	if a.result == nil {
		return nil
	}

	var critical []models.Event
	for _, e := range a.result.Events {
		if e.Severity == models.SeverityCritical {
			critical = append(critical, e)
		}
	}
	return critical
}

// GetEventsBySensorType returns events filtered by sensor type
func (a *Analyzer) GetEventsBySensorType(sensorType string) []models.Event {
	if a.result == nil {
		return nil
	}

	var filtered []models.Event
	for _, e := range a.result.Events {
		if e.SensorType == sensorType {
			filtered = append(filtered, e)
		}
	}
	return filtered
}

// GetAllSerials returns all serial numbers from FRU data
func (a *Analyzer) GetAllSerials() []models.FRUInfo {
	if a.result == nil {
		return nil
	}
	return a.result.FRU
}
internal/exporter/exporter.go (new file): 258 lines
@@ -0,0 +1,258 @@
package exporter

import (
	"encoding/csv"
	"encoding/json"
	"fmt"
	"io"

	"git.mchus.pro/mchus/logpile/internal/models"
)

// Exporter handles data export in various formats
type Exporter struct {
	result *models.AnalysisResult
}

// New creates a new exporter
func New(result *models.AnalysisResult) *Exporter {
	return &Exporter{result: result}
}

// ExportCSV exports serial numbers to CSV format
func (e *Exporter) ExportCSV(w io.Writer) error {
	writer := csv.NewWriter(w)
	defer writer.Flush()

	// Header
	if err := writer.Write([]string{"Component", "Serial Number", "Manufacturer", "Part Number"}); err != nil {
		return err
	}

	if e.result == nil {
		return nil
	}

	// FRU data
	for _, fru := range e.result.FRU {
		if fru.SerialNumber == "" {
			continue
		}
		name := fru.ProductName
		if name == "" {
			name = fru.Description
		}
		if err := writer.Write([]string{
			name,
			fru.SerialNumber,
			fru.Manufacturer,
			fru.PartNumber,
		}); err != nil {
			return err
		}
	}

	// Hardware data
	if e.result.Hardware != nil {
		// Memory
		for _, mem := range e.result.Hardware.Memory {
			if mem.SerialNumber == "" {
				continue
			}
			if err := writer.Write([]string{
				fmt.Sprintf("DIMM Slot %d (%s)", mem.Slot, mem.PartNumber),
				mem.SerialNumber,
				mem.Manufacturer,
				mem.PartNumber,
			}); err != nil {
				return err
			}
		}

		// Storage
		for _, stor := range e.result.Hardware.Storage {
			if stor.SerialNumber == "" {
				continue
			}
			if err := writer.Write([]string{
				fmt.Sprintf("%s %s", stor.Type, stor.Model),
				stor.SerialNumber,
				"",
				"",
			}); err != nil {
				return err
			}
		}

		// PCIe devices
		for _, pcie := range e.result.Hardware.PCIeDevices {
			if pcie.SerialNumber == "" {
				continue
			}
			if err := writer.Write([]string{
				fmt.Sprintf("%s (%s)", pcie.DeviceClass, pcie.Slot),
				pcie.SerialNumber,
				"",
				pcie.PartNumber,
			}); err != nil {
				return err
			}
		}
	}

	return nil
}

// ExportJSON exports all data to JSON format
func (e *Exporter) ExportJSON(w io.Writer) error {
	encoder := json.NewEncoder(w)
	encoder.SetIndent("", "  ")
	return encoder.Encode(e.result)
}

// ExportTXT exports a human-readable text report
func (e *Exporter) ExportTXT(w io.Writer) error {
	fmt.Fprintln(w, "LOGPile Analysis Report")
	fmt.Fprintln(w, "========================")
	fmt.Fprintln(w)

	if e.result == nil {
		fmt.Fprintln(w, "No data loaded.")
		return nil
	}

	fmt.Fprintf(w, "File: %s\n\n", e.result.Filename)

	// Hardware summary
	if e.result.Hardware != nil {
		hw := e.result.Hardware

		// Firmware
		if len(hw.Firmware) > 0 {
			fmt.Fprintln(w, "FIRMWARE VERSIONS")
			fmt.Fprintln(w, "-----------------")
			for _, fw := range hw.Firmware {
				fmt.Fprintf(w, "  %s: %s\n", fw.DeviceName, fw.Version)
			}
			fmt.Fprintln(w)
		}

		// CPUs
		if len(hw.CPUs) > 0 {
			fmt.Fprintln(w, "PROCESSORS")
			fmt.Fprintln(w, "----------")
			for _, cpu := range hw.CPUs {
				fmt.Fprintf(w, "  Socket %d: %s\n", cpu.Socket, cpu.Model)
				fmt.Fprintf(w, "    Cores: %d, Threads: %d, Freq: %d MHz (Turbo: %d MHz)\n",
					cpu.Cores, cpu.Threads, cpu.FrequencyMHz, cpu.MaxFreqMHz)
				fmt.Fprintf(w, "    TDP: %dW, L3 Cache: %d KB\n", cpu.TDP, cpu.L3CacheKB)
			}
			fmt.Fprintln(w)
		}

		// Memory
		if len(hw.Memory) > 0 {
			fmt.Fprintln(w, "MEMORY")
			fmt.Fprintln(w, "------")
			totalMB := 0
			for _, mem := range hw.Memory {
				totalMB += mem.SizeMB
			}
			fmt.Fprintf(w, "  Total: %d GB (%d DIMMs)\n", totalMB/1024, len(hw.Memory))
			fmt.Fprintf(w, "  Type: %s @ %d MHz\n", hw.Memory[0].Type, hw.Memory[0].SpeedMHz)
			fmt.Fprintf(w, "  Manufacturer: %s\n", hw.Memory[0].Manufacturer)
			fmt.Fprintln(w)
		}

		// Storage
		if len(hw.Storage) > 0 {
			fmt.Fprintln(w, "STORAGE")
			fmt.Fprintln(w, "-------")
			for _, stor := range hw.Storage {
				fmt.Fprintf(w, "  %s: %s (%d GB) - S/N: %s\n",
					stor.Slot, stor.Model, stor.SizeGB, stor.SerialNumber)
			}
			fmt.Fprintln(w)
		}

		// PCIe
		if len(hw.PCIeDevices) > 0 {
			fmt.Fprintln(w, "PCIE DEVICES")
			fmt.Fprintln(w, "------------")
			for _, pcie := range hw.PCIeDevices {
				fmt.Fprintf(w, "  %s: %s (x%d %s)\n",
					pcie.Slot, pcie.DeviceClass, pcie.LinkWidth, pcie.LinkSpeed)
				if pcie.SerialNumber != "" {
					fmt.Fprintf(w, "    S/N: %s\n", pcie.SerialNumber)
				}
				if len(pcie.MACAddresses) > 0 {
					fmt.Fprintf(w, "    MACs: %v\n", pcie.MACAddresses)
				}
			}
			fmt.Fprintln(w)
		}
	}

	// Sensors summary
	if len(e.result.Sensors) > 0 {
		fmt.Fprintln(w, "SENSOR READINGS")
		fmt.Fprintln(w, "---------------")

		// Group by type
		byType := make(map[string][]models.SensorReading)
		for _, s := range e.result.Sensors {
			byType[s.Type] = append(byType[s.Type], s)
		}

		for stype, sensors := range byType {
			fmt.Fprintf(w, "\n  %s:\n", stype)
			for _, s := range sensors {
				if s.Value != 0 {
					fmt.Fprintf(w, "    %s: %.0f %s [%s]\n", s.Name, s.Value, s.Unit, s.Status)
				} else if s.RawValue != "" {
					fmt.Fprintf(w, "    %s: %s [%s]\n", s.Name, s.RawValue, s.Status)
				}
			}
		}
		fmt.Fprintln(w)
	}

	// FRU summary
	if len(e.result.FRU) > 0 {
		fmt.Fprintln(w, "FRU COMPONENTS")
		fmt.Fprintln(w, "--------------")
		for _, fru := range e.result.FRU {
			name := fru.ProductName
			if name == "" {
				name = fru.Description
			}
			fmt.Fprintf(w, "  %s\n", name)
			if fru.SerialNumber != "" {
				fmt.Fprintf(w, "    Serial: %s\n", fru.SerialNumber)
			}
			if fru.Manufacturer != "" {
				fmt.Fprintf(w, "    Manufacturer: %s\n", fru.Manufacturer)
			}
		}
		fmt.Fprintln(w)
	}

	// Events summary
	fmt.Fprintf(w, "EVENTS: %d total\n", len(e.result.Events))
	var critical, warning, info int
	for _, ev := range e.result.Events {
		switch ev.Severity {
		case models.SeverityCritical:
			critical++
		case models.SeverityWarning:
			warning++
		case models.SeverityInfo:
			info++
		}
	}
	fmt.Fprintf(w, "  Critical: %d\n", critical)
	fmt.Fprintf(w, "  Warning:  %d\n", warning)
	fmt.Fprintf(w, "  Info:     %d\n", info)

	return nil
}
internal/models/models.go (new file): 159 lines
@@ -0,0 +1,159 @@
package models

import "time"

// AnalysisResult contains all parsed data from an archive
type AnalysisResult struct {
	Filename string          `json:"filename"`
	Events   []Event         `json:"events"`
	FRU      []FRUInfo       `json:"fru"`
	Sensors  []SensorReading `json:"sensors"`
	Hardware *HardwareConfig `json:"hardware"`
}

// Event represents a single log event
type Event struct {
	ID          string    `json:"id"`
	Timestamp   time.Time `json:"timestamp"`
	Source      string    `json:"source"`
	SensorType  string    `json:"sensor_type"`
	SensorName  string    `json:"sensor_name"`
	EventType   string    `json:"event_type"`
	Severity    Severity  `json:"severity"`
	Description string    `json:"description"`
	RawData     string    `json:"raw_data,omitempty"`
}

// Severity represents event severity level
type Severity string

const (
	SeverityCritical Severity = "critical"
	SeverityWarning  Severity = "warning"
	SeverityInfo     Severity = "info"
)

// SensorReading represents a single sensor reading
type SensorReading struct {
	Name     string  `json:"name"`
	Type     string  `json:"type"`
	Value    float64 `json:"value,omitempty"`
	Unit     string  `json:"unit,omitempty"`
	RawValue string  `json:"raw_value,omitempty"`
	Status   string  `json:"status"`
}

// FRUInfo represents Field Replaceable Unit information
type FRUInfo struct {
	DeviceID     string `json:"device_id,omitempty"`
	Description  string `json:"description"`
	ChassisType  string `json:"chassis_type,omitempty"`
	Manufacturer string `json:"manufacturer,omitempty"`
	ProductName  string `json:"product_name,omitempty"`
	SerialNumber string `json:"serial_number,omitempty"`
	PartNumber   string `json:"part_number,omitempty"`
	Version      string `json:"version,omitempty"`
	MfgDate      string `json:"mfg_date,omitempty"`
	AssetTag     string `json:"asset_tag,omitempty"`
}

// HardwareConfig represents server hardware configuration
type HardwareConfig struct {
	Firmware     []FirmwareInfo `json:"firmware,omitempty"`
	BoardInfo    BoardInfo      `json:"board,omitempty"`
	CPUs         []CPU          `json:"cpus,omitempty"`
	Memory       []MemoryDIMM   `json:"memory,omitempty"`
	Storage      []Storage      `json:"storage,omitempty"`
	PCIeDevices  []PCIeDevice   `json:"pcie_devices,omitempty"`
	NetworkCards []NIC          `json:"network_cards,omitempty"`
	PowerSupply  []PSU          `json:"power_supplies,omitempty"`
}

// FirmwareInfo represents firmware version information
type FirmwareInfo struct {
	DeviceName string `json:"device_name"`
	Version    string `json:"version"`
	BuildTime  string `json:"build_time,omitempty"`
}

// BoardInfo represents motherboard information
type BoardInfo struct {
	Manufacturer string `json:"manufacturer,omitempty"`
	ProductName  string `json:"product_name,omitempty"`
	SerialNumber string `json:"serial_number,omitempty"`
	PartNumber   string `json:"part_number,omitempty"`
}

// CPU represents processor information
type CPU struct {
	Socket       int    `json:"socket"`
	Model        string `json:"model"`
	Cores        int    `json:"cores"`
	Threads      int    `json:"threads"`
	FrequencyMHz int    `json:"frequency_mhz"`
	MaxFreqMHz   int    `json:"max_frequency_mhz,omitempty"`
	L1CacheKB    int    `json:"l1_cache_kb,omitempty"`
	L2CacheKB    int    `json:"l2_cache_kb,omitempty"`
	L3CacheKB    int    `json:"l3_cache_kb,omitempty"`
	TDP          int    `json:"tdp_w,omitempty"`
	PPIN         string `json:"ppin,omitempty"`
	SerialNumber string `json:"serial_number,omitempty"`
}

// MemoryDIMM represents a memory module
type MemoryDIMM struct {
	Slot         int    `json:"slot"`
	SizeMB       int    `json:"size_mb"`
	Type         string `json:"type"`
	SpeedMHz     int    `json:"speed_mhz"`
	Manufacturer string `json:"manufacturer,omitempty"`
	SerialNumber string `json:"serial_number,omitempty"`
	PartNumber   string `json:"part_number,omitempty"`
}

// Storage represents a storage device
type Storage struct {
	Slot         string `json:"slot"`
	Type         string `json:"type"`
	Model        string `json:"model"`
	SizeGB       int    `json:"size_gb"`
	SerialNumber string `json:"serial_number,omitempty"`
	Manufacturer string `json:"manufacturer,omitempty"`
	Firmware     string `json:"firmware,omitempty"`
	Interface    string `json:"interface,omitempty"`
}

// PCIeDevice represents a PCIe device
type PCIeDevice struct {
	Slot         string   `json:"slot"`
	VendorID     int      `json:"vendor_id"`
	DeviceID     int      `json:"device_id"`
	BDF          string   `json:"bdf"`
	DeviceClass  string   `json:"device_class"`
	Manufacturer string   `json:"manufacturer,omitempty"`
	LinkWidth    int      `json:"link_width"`
	LinkSpeed    string   `json:"link_speed"`
	MaxLinkWidth int      `json:"max_link_width"`
	MaxLinkSpeed string   `json:"max_link_speed"`
	PartNumber   string   `json:"part_number,omitempty"`
	SerialNumber string   `json:"serial_number,omitempty"`
	MACAddresses []string `json:"mac_addresses,omitempty"`
}

// NIC represents a network interface card
type NIC struct {
	Name         string `json:"name"`
	Model        string `json:"model"`
	MACAddress   string `json:"mac_address"`
	SpeedMbps    int    `json:"speed_mbps,omitempty"`
	SerialNumber string `json:"serial_number,omitempty"`
}

// PSU represents a power supply unit
type PSU struct {
	Slot         string `json:"slot"`
	Model        string `json:"model"`
	WattageW     int    `json:"wattage_w,omitempty"`
	SerialNumber string `json:"serial_number,omitempty"`
	Status       string `json:"status,omitempty"`
}
internal/parser/archive.go (new file): 160 lines
@@ -0,0 +1,160 @@
package parser

import (
	"archive/tar"
	"archive/zip"
	"compress/gzip"
	"fmt"
	"io"
	"os"
	"path/filepath"
	"strings"
)

// ExtractedFile represents a file extracted from an archive
type ExtractedFile struct {
	Path    string
	Content []byte
}

// ExtractArchive extracts a tar.gz or zip archive and returns the file contents
func ExtractArchive(archivePath string) ([]ExtractedFile, error) {
	ext := strings.ToLower(filepath.Ext(archivePath))

	switch ext {
	case ".gz", ".tgz":
		return extractTarGz(archivePath)
	case ".zip":
		return extractZip(archivePath)
	default:
		return nil, fmt.Errorf("unsupported archive format: %s", ext)
	}
}

// ExtractArchiveFromReader extracts an archive from a reader
func ExtractArchiveFromReader(r io.Reader, filename string) ([]ExtractedFile, error) {
	ext := strings.ToLower(filepath.Ext(filename))

	switch ext {
	case ".gz", ".tgz":
		return extractTarGzFromReader(r)
	default:
		return nil, fmt.Errorf("unsupported archive format: %s", ext)
	}
}

func extractTarGz(archivePath string) ([]ExtractedFile, error) {
	f, err := os.Open(archivePath)
	if err != nil {
		return nil, fmt.Errorf("open archive: %w", err)
	}
	defer f.Close()

	return extractTarGzFromReader(f)
}

func extractTarGzFromReader(r io.Reader) ([]ExtractedFile, error) {
	gzr, err := gzip.NewReader(r)
	if err != nil {
		return nil, fmt.Errorf("gzip reader: %w", err)
	}
	defer gzr.Close()

	tr := tar.NewReader(gzr)
	var files []ExtractedFile

	for {
		header, err := tr.Next()
		if err == io.EOF {
			break
		}
		if err != nil {
			return nil, fmt.Errorf("tar read: %w", err)
		}

		// Skip directories
		if header.Typeflag == tar.TypeDir {
			continue
		}

		// Skip large files (>10MB)
		if header.Size > 10*1024*1024 {
			continue
		}

		content, err := io.ReadAll(tr)
		if err != nil {
			return nil, fmt.Errorf("read file %s: %w", header.Name, err)
		}

		files = append(files, ExtractedFile{
			Path:    header.Name,
			Content: content,
		})
	}

	return files, nil
}

func extractZip(archivePath string) ([]ExtractedFile, error) {
	r, err := zip.OpenReader(archivePath)
	if err != nil {
		return nil, fmt.Errorf("open zip: %w", err)
	}
	defer r.Close()

	var files []ExtractedFile

	for _, f := range r.File {
		if f.FileInfo().IsDir() {
			continue
		}

		// Skip large files (>10MB)
		if f.FileInfo().Size() > 10*1024*1024 {
			continue
		}

		rc, err := f.Open()
		if err != nil {
			return nil, fmt.Errorf("open file %s: %w", f.Name, err)
		}

		content, err := io.ReadAll(rc)
		rc.Close()
		if err != nil {
			return nil, fmt.Errorf("read file %s: %w", f.Name, err)
		}

		files = append(files, ExtractedFile{
			Path:    f.Name,
			Content: content,
		})
	}

	return files, nil
}

// FindFileByPattern finds files whose path matches any pattern (case-insensitive substring)
func FindFileByPattern(files []ExtractedFile, patterns ...string) []ExtractedFile {
	var result []ExtractedFile
	for _, f := range files {
		for _, pattern := range patterns {
			if strings.Contains(strings.ToLower(f.Path), strings.ToLower(pattern)) {
				result = append(result, f)
				break
			}
		}
	}
	return result
}

// FindFileByName finds a file by exact base name (case-insensitive)
func FindFileByName(files []ExtractedFile, name string) *ExtractedFile {
	for _, f := range files {
		if strings.EqualFold(filepath.Base(f.Path), name) {
			return &f
		}
	}
	return nil
}
internal/parser/interface.go (new file): 30 lines
@@ -0,0 +1,30 @@
package parser

import (
	"git.mchus.pro/mchus/logpile/internal/models"
)

// VendorParser is the interface for vendor-specific parsers
type VendorParser interface {
	// Name returns a human-readable parser name
	Name() string

	// Vendor returns the vendor identifier (e.g., "inspur", "supermicro", "dell")
	Vendor() string

	// Detect checks if this parser can handle the given files.
	// Returns a confidence score 0-100 (0 = cannot parse, 100 = definitely this format)
	Detect(files []ExtractedFile) int

	// Parse parses the extracted files and returns the analysis result
	Parse(files []ExtractedFile) (*models.AnalysisResult, error)
}

// FileParser is the interface for parsing specific file types within a vendor module
type FileParser interface {
	// CanParse checks if this parser can handle the file
	CanParse(file ExtractedFile) bool

	// Parse parses the file content
	Parse(content []byte) error
}
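The `Detect` confidence score enables a simple dispatch: score the files with every registered parser and keep the highest-scoring one. A self-contained sketch of that idea (names, file markers, and scores here are illustrative, not the project's actual `DetectFormat` implementation):

```go
package main

import (
	"errors"
	"fmt"
)

// Parser mirrors the confidence-scoring part of VendorParser.
type Parser interface {
	Name() string
	Detect(files []string) int // 0 = cannot parse, 100 = certain
}

// inspur is a toy parser that recognizes one marker file.
type inspur struct{}

func (inspur) Name() string { return "Inspur/Kaytus" }
func (inspur) Detect(files []string) int {
	for _, f := range files {
		if f == "devicefrusdr.log" {
			return 90 // marker file strongly suggests this format
		}
	}
	return 0
}

// detectFormat picks the parser with the highest confidence score.
func detectFormat(parsers []Parser, files []string) (Parser, error) {
	var best Parser
	bestScore := 0
	for _, p := range parsers {
		if s := p.Detect(files); s > bestScore {
			best, bestScore = p, s
		}
	}
	if best == nil {
		return nil, errors.New("no parser recognizes this archive")
	}
	return best, nil
}

func main() {
	p, err := detectFormat([]Parser{inspur{}}, []string{"asset.json", "devicefrusdr.log"})
	if err != nil {
		panic(err)
	}
	fmt.Println("detected:", p.Name())
}
```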
internal/parser/parser.go (new file): 86 lines
@@ -0,0 +1,86 @@
|
||||
package parser

import (
	"fmt"
	"io"

	"git.mchus.pro/mchus/logpile/internal/models"
)

// BMCParser parses BMC diagnostic archives using vendor-specific parsers
type BMCParser struct {
	result       *models.AnalysisResult
	files        []ExtractedFile
	vendorParser VendorParser
}

// NewBMCParser creates a new BMC parser
func NewBMCParser() *BMCParser {
	return &BMCParser{
		result: &models.AnalysisResult{
			Events:  make([]models.Event, 0),
			FRU:     make([]models.FRUInfo, 0),
			Sensors: make([]models.SensorReading, 0),
		},
	}
}

// ParseArchive parses an archive file
func (p *BMCParser) ParseArchive(archivePath string) error {
	files, err := ExtractArchive(archivePath)
	if err != nil {
		return fmt.Errorf("extract archive: %w", err)
	}
	p.files = files
	return p.parseFiles()
}

// ParseFromReader parses an archive from a reader
func (p *BMCParser) ParseFromReader(r io.Reader, filename string) error {
	files, err := ExtractArchiveFromReader(r, filename)
	if err != nil {
		return fmt.Errorf("extract archive: %w", err)
	}
	p.files = files
	p.result.Filename = filename
	return p.parseFiles()
}

func (p *BMCParser) parseFiles() error {
	// Auto-detect format
	vendorParser, err := DetectFormat(p.files)
	if err != nil {
		return fmt.Errorf("detect format: %w", err)
	}
	p.vendorParser = vendorParser

	// Parse using the detected vendor parser
	result, err := vendorParser.Parse(p.files)
	if err != nil {
		return fmt.Errorf("parse with %s: %w", vendorParser.Name(), err)
	}

	// Preserve filename
	result.Filename = p.result.Filename
	p.result = result

	return nil
}

// Result returns the analysis result
func (p *BMCParser) Result() *models.AnalysisResult {
	return p.result
}

// GetFiles returns all extracted files
func (p *BMCParser) GetFiles() []ExtractedFile {
	return p.files
}

// DetectedVendor returns the detected vendor parser name
func (p *BMCParser) DetectedVendor() string {
	if p.vendorParser != nil {
		return p.vendorParser.Name()
	}
	return "unknown"
}
106
internal/parser/registry.go
Normal file
@@ -0,0 +1,106 @@
package parser

import (
	"fmt"
	"sort"
	"sync"
)

var (
	registry     = make(map[string]VendorParser)
	registryLock sync.RWMutex
)

// Register adds a vendor parser to the registry.
// Called from vendor module init() functions.
func Register(p VendorParser) {
	registryLock.Lock()
	defer registryLock.Unlock()

	vendor := p.Vendor()
	if _, exists := registry[vendor]; exists {
		panic(fmt.Sprintf("parser already registered for vendor: %s", vendor))
	}
	registry[vendor] = p
}

// GetParser returns the parser for a specific vendor
func GetParser(vendor string) (VendorParser, bool) {
	registryLock.RLock()
	defer registryLock.RUnlock()

	p, ok := registry[vendor]
	return p, ok
}

// ListParsers returns the names of all registered vendors
func ListParsers() []string {
	registryLock.RLock()
	defer registryLock.RUnlock()

	vendors := make([]string, 0, len(registry))
	for v := range registry {
		vendors = append(vendors, v)
	}
	sort.Strings(vendors)
	return vendors
}

// DetectResult holds the detection result for a parser
type DetectResult struct {
	Parser     VendorParser
	Confidence int
}

// DetectFormat tries to detect the archive format and returns the best matching parser
func DetectFormat(files []ExtractedFile) (VendorParser, error) {
	registryLock.RLock()
	defer registryLock.RUnlock()

	var results []DetectResult

	for _, p := range registry {
		confidence := p.Detect(files)
		if confidence > 0 {
			results = append(results, DetectResult{
				Parser:     p,
				Confidence: confidence,
			})
		}
	}

	if len(results) == 0 {
		return nil, fmt.Errorf("no parser found for this archive format")
	}

	// Sort by confidence, descending
	sort.Slice(results, func(i, j int) bool {
		return results[i].Confidence > results[j].Confidence
	})

	return results[0].Parser, nil
}

// DetectAllFormats returns all parsers that can handle the files, with their confidence
func DetectAllFormats(files []ExtractedFile) []DetectResult {
	registryLock.RLock()
	defer registryLock.RUnlock()

	var results []DetectResult

	for _, p := range registry {
		confidence := p.Detect(files)
		if confidence > 0 {
			results = append(results, DetectResult{
				Parser:     p,
				Confidence: confidence,
			})
		}
	}

	sort.Slice(results, func(i, j int) bool {
		return results[i].Confidence > results[j].Confidence
	})

	return results
}
96
internal/parser/vendors/README.md
vendored
Normal file
@@ -0,0 +1,96 @@
# Vendor Parser Modules

Each server manufacturer uses its own BMC diagnostic archive format.
This directory contains the parser modules for the different vendors.

## Module structure

```
vendors/
├── vendors.go             # Imports of all modules (add new ones here)
├── README.md              # This documentation
├── template/              # Template for a new module
│   └── parser.go.template
├── inspur/                # Inspur/Kaytus module
│   ├── parser.go          # Main parser + registration
│   ├── sdr.go             # SDR parsing (sensors)
│   ├── fru.go             # FRU parsing (serial numbers)
│   ├── asset.go           # asset.json parsing
│   └── syslog.go          # syslog parsing
├── supermicro/            # Future Supermicro module
├── dell/                  # Future Dell iDRAC module
└── hpe/                   # Future HPE iLO module
```

## How to add a new module

### 1. Create the module directory

```bash
mkdir -p internal/parser/vendors/VENDORNAME
```

### 2. Copy the template

```bash
cp internal/parser/vendors/template/parser.go.template \
   internal/parser/vendors/VENDORNAME/parser.go
```

### 3. Edit parser.go

- Replace `VENDORNAME` with the vendor identifier (e.g. `supermicro`)
- Replace `VENDOR_DESCRIPTION` with a description (e.g. `Supermicro`)
- Implement the `Detect()` method for format detection
- Implement the `Parse()` method for parsing the data

### 4. Register the module

Add an import to `vendors/vendors.go`:

```go
import (
	_ "git.mchus.pro/mchus/logpile/internal/parser/vendors/inspur"
	_ "git.mchus.pro/mchus/logpile/internal/parser/vendors/VENDORNAME" // New module
)
```

### 5. Done!

The module registers itself automatically at application startup via `init()`.

## The VendorParser interface

```go
type VendorParser interface {
	// Name returns the human-readable parser name
	Name() string

	// Vendor returns the vendor identifier
	Vendor() string

	// Detect checks whether this parser can handle the files
	// Returns confidence 0-100 (0 = cannot parse, 100 = definitely this format)
	Detect(files []ExtractedFile) int

	// Parse parses the extracted files
	Parse(files []ExtractedFile) (*models.AnalysisResult, error)
}
```

## Tips for implementing Detect()

- Look for files/directories unique to the vendor
- Check file contents for characteristic markers
- Return high confidence (70+) only on a confident match
- Several parsers may return >0; the one with the highest confidence wins

## Supported vendors

| Vendor | Identifier | Status | Tested on |
|--------|------------|--------|-----------|
| Inspur/Kaytus | `inspur` | ✅ Ready | KR4268X2 (onekeylog) |
| Supermicro | `supermicro` | ⏳ Planned | - |
| Dell iDRAC | `dell` | ⏳ Planned | - |
| HPE iLO | `hpe` | ⏳ Planned | - |
| Lenovo XCC | `lenovo` | ⏳ Planned | - |
352
internal/parser/vendors/inspur/asset.go
vendored
Normal file
@@ -0,0 +1,352 @@
package inspur

import (
	"encoding/json"
	"fmt"
	"strings"

	"git.mchus.pro/mchus/logpile/internal/models"
	"git.mchus.pro/mchus/logpile/internal/parser/vendors/pciids"
)

// AssetJSON represents the structure of the Inspur asset.json file
type AssetJSON struct {
	VersionInfo []struct {
		DeviceID       int    `json:"DeviceId"`
		DeviceName     string `json:"DeviceName"`
		DeviceRevision string `json:"DeviceRevision"`
		BuildTime      string `json:"BuildTime"`
	} `json:"VersionInfo"`

	CpuInfo []struct {
		ProcessorName       string `json:"ProcessorName"`
		ProcessorID         string `json:"ProcessorId"`
		MicroCodeVer        string `json:"MicroCodeVer"`
		CurrentSpeed        int    `json:"CurrentSpeed"`
		Core                int    `json:"Core"`
		ThreadCount         int    `json:"ThreadCount"`
		L1Cache             int    `json:"L1Cache"`
		L2Cache             int    `json:"L2Cache"`
		L3Cache             int    `json:"L3Cache"`
		CpuTdp              int    `json:"CpuTdp"`
		PPIN                string `json:"PPIN"`
		TurboEnableMaxSpeed int    `json:"TurboEnableMaxSpeed"`
		TurboCloseMaxSpeed  int    `json:"TurboCloseMaxSpeed"`
		UPIBandwidth        string `json:"UPIBandwidth"`
	} `json:"CpuInfo"`

	MemInfo struct {
		MemCommonInfo []struct {
			Manufacturer      string `json:"Manufacturer"`
			MaxSpeed          int    `json:"MaxSpeed"`
			CurrentSpeed      int    `json:"CurrentSpeed"`
			MemoryType        int    `json:"MemoryType"`
			Rank              int    `json:"Rank"`
			DataWidth         int    `json:"DataWidth"`
			ConfiguredVoltage int    `json:"ConfiguredVoltage"`
			PhysicalSize      int    `json:"PhysicalSize"`
		} `json:"MemCommonInfo"`

		DimmInfo []struct {
			SerialNumber string `json:"SerialNumber"`
			PartNumber   string `json:"PartNumber"`
			AssetTag     string `json:"AssetTag"`
		} `json:"DimmInfo"`
	} `json:"MemInfo"`

	HddInfo []struct {
		SerialNumber       string `json:"SerialNumber"`
		Manufacturer       string `json:"Manufacturer"`
		ModelName          string `json:"ModelName"`
		FirmwareVersion    string `json:"FirmwareVersion"`
		Capacity           int    `json:"Capacity"`
		Location           int    `json:"Location"`
		DiskInterfaceType  int    `json:"DiskInterfaceType"`
		MediaType          int    `json:"MediaType"`
		LocationString     string `json:"LocationString"`
		BlockSizeBytes     int    `json:"BlockSizeBytes"`
		CapableSpeedGbs    string `json:"CapableSpeedGbs"`
		NegotiatedSpeedGbs string `json:"NegotiatedSpeedGbs"`
		PcieSlot           int    `json:"PcieSlot"`
	} `json:"HddInfo"`

	PcieInfo []struct {
		VendorId            int      `json:"VendorId"`
		DeviceId            int      `json:"DeviceId"`
		BusNumber           int      `json:"BusNumber"`
		DeviceNumber        int      `json:"DeviceNumber"`
		FunctionNumber      int      `json:"FunctionNumber"`
		MaxLinkWidth        int      `json:"MaxLinkWidth"`
		MaxLinkSpeed        int      `json:"MaxLinkSpeed"`
		NegotiatedLinkWidth int      `json:"NegotiatedLinkWidth"`
		CurrentLinkSpeed    int      `json:"CurrentLinkSpeed"`
		ClassCode           int      `json:"ClassCode"`
		SubClassCode        int      `json:"SubClassCode"`
		PcieSlot            int      `json:"PcieSlot"`
		LocString           string   `json:"LocString"`
		PartNumber          *string  `json:"PartNumber"`
		SerialNumber        *string  `json:"SerialNumber"`
		Mac                 []string `json:"Mac"`
	} `json:"PcieInfo"`
}

// ParseAssetJSON parses Inspur asset.json content
func ParseAssetJSON(content []byte) (*models.HardwareConfig, error) {
	var asset AssetJSON
	if err := json.Unmarshal(content, &asset); err != nil {
		return nil, err
	}

	config := &models.HardwareConfig{}

	// Parse version info
	for _, v := range asset.VersionInfo {
		config.Firmware = append(config.Firmware, models.FirmwareInfo{
			DeviceName: v.DeviceName,
			Version:    v.DeviceRevision,
			BuildTime:  v.BuildTime,
		})
	}

	// Parse CPU info
	for i, cpu := range asset.CpuInfo {
		config.CPUs = append(config.CPUs, models.CPU{
			Socket:       i,
			Model:        strings.TrimSpace(cpu.ProcessorName),
			Cores:        cpu.Core,
			Threads:      cpu.ThreadCount,
			FrequencyMHz: cpu.CurrentSpeed,
			MaxFreqMHz:   cpu.TurboEnableMaxSpeed,
			L1CacheKB:    cpu.L1Cache,
			L2CacheKB:    cpu.L2Cache,
			L3CacheKB:    cpu.L3Cache,
			TDP:          cpu.CpuTdp,
			PPIN:         cpu.PPIN,
		})
	}

	// Parse memory info
	if len(asset.MemInfo.MemCommonInfo) > 0 {
		common := asset.MemInfo.MemCommonInfo[0]
		for i, dimm := range asset.MemInfo.DimmInfo {
			config.Memory = append(config.Memory, models.MemoryDIMM{
				Slot:         i,
				SizeMB:       common.PhysicalSize * 1024,
				Type:         memoryTypeToString(common.MemoryType),
				SpeedMHz:     common.CurrentSpeed,
				Manufacturer: common.Manufacturer,
				SerialNumber: dimm.SerialNumber,
				PartNumber:   strings.TrimSpace(dimm.PartNumber),
			})
		}
	}

	// Parse storage info
	for _, hdd := range asset.HddInfo {
		storageType := "HDD"
		if hdd.DiskInterfaceType == 5 {
			storageType = "NVMe"
		} else if hdd.MediaType == 1 {
			storageType = "SSD"
		}

		// Resolve the manufacturer: try the vendor ID first, then model-name extraction
		modelName := strings.TrimSpace(hdd.ModelName)
		manufacturer := resolveManufacturer(hdd.Manufacturer, modelName)

		config.Storage = append(config.Storage, models.Storage{
			Slot:         hdd.LocationString,
			Type:         storageType,
			Model:        modelName,
			SizeGB:       hdd.Capacity,
			SerialNumber: hdd.SerialNumber,
			Manufacturer: manufacturer,
			Firmware:     hdd.FirmwareVersion,
			Interface:    diskInterfaceToString(hdd.DiskInterfaceType),
		})
	}

	// Parse PCIe info
	for _, pcie := range asset.PcieInfo {
		vendor, deviceName := pciids.DeviceInfo(pcie.VendorId, pcie.DeviceId)
		device := models.PCIeDevice{
			Slot:         pcie.LocString,
			VendorID:     pcie.VendorId,
			DeviceID:     pcie.DeviceId,
			BDF:          formatBDF(pcie.BusNumber, pcie.DeviceNumber, pcie.FunctionNumber),
			LinkWidth:    pcie.NegotiatedLinkWidth,
			LinkSpeed:    pcieLinkSpeedToString(pcie.CurrentLinkSpeed),
			MaxLinkWidth: pcie.MaxLinkWidth,
			MaxLinkSpeed: pcieLinkSpeedToString(pcie.MaxLinkSpeed),
			DeviceClass:  pcieClassToString(pcie.ClassCode, pcie.SubClassCode),
			Manufacturer: vendor,
		}
		if pcie.PartNumber != nil {
			device.PartNumber = strings.TrimSpace(*pcie.PartNumber)
		}
		if pcie.SerialNumber != nil {
			device.SerialNumber = strings.TrimSpace(*pcie.SerialNumber)
		}
		if len(pcie.Mac) > 0 {
			device.MACAddresses = pcie.Mac
		}
		// Use the device name from the PCI IDs database if available
		if deviceName != "" {
			device.DeviceClass = deviceName
		}
		config.PCIeDevices = append(config.PCIeDevices, device)
	}

	return config, nil
}

func memoryTypeToString(memType int) string {
	switch memType {
	case 26:
		return "DDR4"
	case 34:
		return "DDR5"
	default:
		return "Unknown"
	}
}

func diskInterfaceToString(ifType int) string {
	switch ifType {
	case 4:
		return "SATA"
	case 5:
		return "NVMe"
	case 6:
		return "SAS"
	default:
		return "Unknown"
	}
}

func pcieLinkSpeedToString(speed int) string {
	switch speed {
	case 1:
		return "2.5 GT/s"
	case 2:
		return "5.0 GT/s"
	case 3:
		return "8.0 GT/s"
	case 4:
		return "16.0 GT/s"
	case 5:
		return "32.0 GT/s"
	default:
		return "Unknown"
	}
}

func pcieClassToString(classCode, subClass int) string {
	switch classCode {
	case 1:
		switch subClass {
		case 0:
			return "SCSI"
		case 1:
			return "IDE"
		case 4:
			return "RAID"
		case 6:
			return "SATA"
		case 7:
			return "SAS"
		case 8:
			return "NVMe"
		default:
			return "Storage"
		}
	case 2:
		return "Network"
	case 3:
		switch subClass {
		case 0:
			return "VGA"
		case 2:
			return "3D Controller"
		default:
			return "Display"
		}
	case 4:
		return "Multimedia"
	case 6:
		return "Bridge"
	case 12:
		return "Serial Bus"
	default:
		return "Other"
	}
}

func formatBDF(bus, dev, fun int) string {
	return fmt.Sprintf("%02x:%02x.%x", bus, dev, fun)
}

// resolveManufacturer resolves the manufacturer name from several sources
func resolveManufacturer(rawManufacturer, modelName string) string {
	raw := strings.TrimSpace(rawManufacturer)

	// If it looks like a vendor ID (hex), try to resolve it
	if raw != "" {
		if name := pciids.VendorNameFromString(raw); name != "" {
			return name
		}
		// If it is not a vendor ID but looks like a real name (has letters), use it
		hasLetter := false
		for _, c := range raw {
			if (c >= 'A' && c <= 'Z') || (c >= 'a' && c <= 'z') {
				hasLetter = true
				break
			}
		}
		if hasLetter && len(raw) > 2 {
			return raw
		}
	}

	// Try to extract it from the model name
	return extractStorageManufacturer(modelName)
}

// extractStorageManufacturer tries to extract the manufacturer from the model name
func extractStorageManufacturer(model string) string {
	modelUpper := strings.ToUpper(model)

	knownVendors := []struct {
		prefix string
		name   string
	}{
		{"SAMSUNG", "Samsung"},
		{"KIOXIA", "KIOXIA"},
		{"TOSHIBA", "Toshiba"},
		{"WDC", "Western Digital"},
		{"WD", "Western Digital"},
		{"SEAGATE", "Seagate"},
		{"HGST", "HGST"},
		{"INTEL", "Intel"},
		{"MICRON", "Micron"},
		{"KINGSTON", "Kingston"},
		{"CRUCIAL", "Crucial"},
		{"SK HYNIX", "SK Hynix"},
		{"SKHYNIX", "SK Hynix"},
		{"SANDISK", "SanDisk"},
		{"LITEON", "Lite-On"},
		{"PLEXTOR", "Plextor"},
		{"ADATA", "ADATA"},
		{"TRANSCEND", "Transcend"},
		{"CORSAIR", "Corsair"},
		{"SOLIDIGM", "Solidigm"},
	}

	for _, v := range knownVendors {
		if strings.HasPrefix(modelUpper, v.prefix) {
			return v.name
		}
	}

	return ""
}
147
internal/parser/vendors/inspur/component.go
vendored
Normal file
@@ -0,0 +1,147 @@
package inspur

import (
	"encoding/json"
	"fmt"
	"regexp"
	"strings"
	"time"

	"git.mchus.pro/mchus/logpile/internal/models"
)

// ParseComponentLog parses the component.log file and extracts PSU and other info
func ParseComponentLog(content []byte, hw *models.HardwareConfig) {
	if hw == nil {
		return
	}

	text := string(content)

	// Parse RESTful PSU info
	parsePSUInfo(text, hw)
}

// ParseComponentLogEvents extracts events from component.log (memory errors, etc.)
func ParseComponentLogEvents(content []byte) []models.Event {
	var events []models.Event
	text := string(content)

	// Parse RESTful Memory info for Warning/Error status
	memEvents := parseMemoryEvents(text)
	events = append(events, memEvents...)

	return events
}

// PSUInfo represents the RESTful PSU info structure
type PSUInfo struct {
	PowerSupplies []struct {
		ID         int    `json:"id"`
		Present    int    `json:"present"`
		VendorID   string `json:"vendor_id"`
		Model      string `json:"model"`
		SerialNum  string `json:"serial_num"`
		FwVer      string `json:"fw_ver"`
		RatedPower int    `json:"rated_power"`
		Status     string `json:"status"`
	} `json:"power_supplies"`
}

func parsePSUInfo(text string, hw *models.HardwareConfig) {
	// Find the RESTful PSU info section
	re := regexp.MustCompile(`RESTful PSU info:\s*(\{[\s\S]*?\})\s*(?:RESTful|BMC|$)`)
	match := re.FindStringSubmatch(text)
	if match == nil {
		return
	}

	jsonStr := match[1]
	// Clean up the JSON (it might contain newlines)
	jsonStr = strings.ReplaceAll(jsonStr, "\n", "")

	var psuInfo PSUInfo
	if err := json.Unmarshal([]byte(jsonStr), &psuInfo); err != nil {
		return
	}

	// Clear existing PSU data and populate it with the RESTful data
	hw.PowerSupply = nil
	for _, psu := range psuInfo.PowerSupplies {
		if psu.Present != 1 {
			continue
		}
		hw.PowerSupply = append(hw.PowerSupply, models.PSU{
			Slot:         formatPSUSlot(psu.ID),
			Model:        psu.Model,
			WattageW:     psu.RatedPower,
			SerialNumber: psu.SerialNum,
			Status:       psu.Status,
		})
	}
}

func formatPSUSlot(id int) string {
	return fmt.Sprintf("PSU%d", id)
}

// MemoryInfo represents the RESTful Memory info structure
type MemoryInfo struct {
	MemModules []struct {
		MemModID      int    `json:"mem_mod_id"`
		MemModSlot    string `json:"mem_mod_slot"`
		MemModSize    int    `json:"mem_mod_size"`
		MemModVendor  string `json:"mem_mod_vendor"`
		MemModPartNum string `json:"mem_mod_part_num"`
		MemModSerial  string `json:"mem_mod_serial_num"`
		Status        string `json:"status"`
	} `json:"mem_modules"`
}

func parseMemoryEvents(text string) []models.Event {
	var events []models.Event

	// Find the RESTful Memory info section
	re := regexp.MustCompile(`RESTful Memory info:\s*(\{[\s\S]*?\})\s*RESTful HDD`)
	match := re.FindStringSubmatch(text)
	if match == nil {
		return events
	}

	jsonStr := match[1]
	jsonStr = strings.ReplaceAll(jsonStr, "\n", "")

	var memInfo MemoryInfo
	if err := json.Unmarshal([]byte(jsonStr), &memInfo); err != nil {
		return events
	}

	// Generate events for memory modules with Warning or Error status
	for _, mem := range memInfo.MemModules {
		if mem.Status == "Warning" || mem.Status == "Error" || mem.Status == "Critical" {
			severity := models.SeverityWarning
			if mem.Status == "Error" || mem.Status == "Critical" {
				severity = models.SeverityCritical
			}

			description := fmt.Sprintf("Memory module %s: %s", mem.MemModSlot, mem.Status)
			if mem.MemModSize == 0 {
				description = fmt.Sprintf("Memory module %s not detected (capacity 0GB)", mem.MemModSlot)
			}

			events = append(events, models.Event{
				ID:          fmt.Sprintf("mem_%d", mem.MemModID),
				Timestamp:   time.Now(), // no timestamp in the source, use the current time
				Source:      "Memory",
				SensorType:  "memory",
				SensorName:  mem.MemModSlot,
				EventType:   "Memory Status",
				Severity:    severity,
				Description: description,
				RawData:     fmt.Sprintf("Slot: %s, Vendor: %s, P/N: %s, S/N: %s", mem.MemModSlot, mem.MemModVendor, mem.MemModPartNum, mem.MemModSerial),
			})
		}
	}

	return events
}
97
internal/parser/vendors/inspur/fru.go
vendored
Normal file
@@ -0,0 +1,97 @@
package inspur

import (
	"bufio"
	"regexp"
	"strings"

	"git.mchus.pro/mchus/logpile/internal/models"
)

var (
	fruDeviceRegex = regexp.MustCompile(`^FRU Device Description\s*:\s*(.+)$`)
	fruFieldRegex  = regexp.MustCompile(`^\s+(.+?)\s*:\s*(.*)$`)
)

// ParseFRU parses BMC FRU (Field Replaceable Unit) output
func ParseFRU(content []byte) []models.FRUInfo {
	var fruList []models.FRUInfo
	var current *models.FRUInfo

	scanner := bufio.NewScanner(strings.NewReader(string(content)))
	for scanner.Scan() {
		line := scanner.Text()

		// Check for a new FRU device
		if matches := fruDeviceRegex.FindStringSubmatch(line); matches != nil {
			if current != nil && current.Description != "" {
				fruList = append(fruList, *current)
			}
			current = &models.FRUInfo{
				Description: strings.TrimSpace(matches[1]),
			}
			continue
		}

		// Skip if there is no current FRU device
		if current == nil {
			continue
		}

		// Skip "Device not present" entries
		if strings.Contains(line, "Device not present") {
			current = nil
			continue
		}

		// Parse FRU fields
		if matches := fruFieldRegex.FindStringSubmatch(line); matches != nil {
			fieldName := strings.TrimSpace(matches[1])
			fieldValue := strings.TrimSpace(matches[2])

			switch fieldName {
			case "Chassis Type":
				current.ChassisType = fieldValue
			case "Chassis Part Number":
				if fieldValue != "0" {
					current.PartNumber = fieldValue
				}
			case "Chassis Serial":
				if fieldValue != "0" {
					current.SerialNumber = fieldValue
				}
			case "Board Mfg Date":
				current.MfgDate = fieldValue
			case "Board Mfg", "Product Manufacturer":
				if fieldValue != "NULL" {
					current.Manufacturer = fieldValue
				}
			case "Board Product", "Product Name":
				if fieldValue != "NULL" {
					current.ProductName = fieldValue
				}
			case "Board Serial", "Product Serial":
				current.SerialNumber = fieldValue
			case "Board Part Number", "Product Part Number":
				if fieldValue != "0" {
					current.PartNumber = fieldValue
				}
			case "Product Version":
				if fieldValue != "0" {
					current.Version = fieldValue
				}
			case "Product Asset Tag":
				if fieldValue != "NULL" {
					current.AssetTag = fieldValue
				}
			}
		}
	}

	// Don't forget the last one
	if current != nil && current.Description != "" {
		fruList = append(fruList, *current)
	}

	return fruList
}
123
internal/parser/vendors/inspur/idl.go
vendored
Normal file
@@ -0,0 +1,123 @@
package inspur

import (
	"regexp"
	"strings"
	"time"

	"git.mchus.pro/mchus/logpile/internal/models"
)

// ParseIDLLog parses the IDL (Inspur Diagnostic Log) file for BMC alarms
// Format: |timestamp|component|type|severity|eventID|description|
func ParseIDLLog(content []byte) []models.Event {
	var events []models.Event

	// Pattern to match CommerDiagnose log entries
	// Example: |2025-12-02T17:54:27+08:00|MEMORY|Assert|Warning|0C180401|CPU1_C4D0 Memory Device Disabled...|
	re := regexp.MustCompile(`\|(\d{4}-\d{2}-\d{2}T[\d:]+[+-]\d{2}:\d{2})\|([^|]+)\|([^|]+)\|([^|]+)\|([^|]+)\|([^|]+)\|`)

	lines := strings.Split(string(content), "\n")
	seenEvents := make(map[string]bool) // deduplicate events

	for _, line := range lines {
		if !strings.Contains(line, "CommerDiagnose") {
			continue
		}

		matches := re.FindStringSubmatch(line)
		if matches == nil {
			continue
		}

		timestamp := matches[1]
		component := matches[2]
		eventType := matches[3]
		severityStr := matches[4]
		eventID := matches[5]
		description := matches[6]

		// Parse the timestamp
		ts, err := time.Parse("2006-01-02T15:04:05-07:00", timestamp)
		if err != nil {
			ts = time.Now()
		}

		// Map the severity
		severity := mapIDLSeverity(severityStr)

		// Clean up the description
		description = cleanDescription(description)

		// Build a unique key for deduplication
		eventKey := eventID + "|" + description
		if seenEvents[eventKey] {
			continue
		}
		seenEvents[eventKey] = true

		// Extract the sensor name from the description, if available
		sensorName := extractSensorName(description, component)

		events = append(events, models.Event{
			ID:          eventID,
			Timestamp:   ts,
			Source:      component,
			SensorType:  strings.ToLower(component),
			SensorName:  sensorName,
			EventType:   eventType,
			Severity:    severity,
			Description: description,
		})
	}

	return events
}

func mapIDLSeverity(s string) models.Severity {
	switch strings.ToLower(s) {
	case "critical", "error":
		return models.SeverityCritical
	case "warning":
		return models.SeverityWarning
	default:
		return models.SeverityInfo
	}
}

func cleanDescription(desc string) string {
	// Remove trailing " - Assert" or similar
	desc = strings.TrimSuffix(desc, " - Assert")
	desc = strings.TrimSuffix(desc, " - Deassert")
	desc = strings.TrimSpace(desc)
	return desc
}

func extractSensorName(desc, component string) string {
	// Try to extract the sensor/device name from the description
	// For memory: CPU1_C4D0, CPU1_C4D1, etc.
	if component == "MEMORY" {
		re := regexp.MustCompile(`(CPU\d+_C\d+D\d+)`)
		if matches := re.FindStringSubmatch(desc); matches != nil {
			return matches[1]
		}
	}

	// For PSUs: PSU0, PSU1, etc.
	if component == "PSU" || component == "POWER" {
		re := regexp.MustCompile(`(PSU\d+)`)
		if matches := re.FindStringSubmatch(desc); matches != nil {
			return matches[1]
		}
	}

	// For temperature sensors
	if component == "TEMPERATURE" || component == "THERMAL" {
		re := regexp.MustCompile(`(\w+_Temp|\w+_DTS)`)
		if matches := re.FindStringSubmatch(desc); matches != nil {
			return matches[1]
		}
	}

	return component
}
143
internal/parser/vendors/inspur/parser.go
vendored
Normal file
@@ -0,0 +1,143 @@
// Package inspur provides a parser for Inspur/Kaytus BMC diagnostic archives
// Tested with: Kaytus KR4268X2 (onekeylog format)
package inspur

import (
	"strings"

	"git.mchus.pro/mchus/logpile/internal/models"
	"git.mchus.pro/mchus/logpile/internal/parser"
)

func init() {
	parser.Register(&Parser{})
}

// Parser implements VendorParser for Inspur/Kaytus servers
type Parser struct{}

// Name returns the human-readable parser name
func (p *Parser) Name() string {
	return "Inspur/Kaytus BMC Parser"
}

// Vendor returns the vendor identifier
func (p *Parser) Vendor() string {
	return "inspur"
}

// Detect checks whether the archive matches the Inspur/Kaytus format.
// Returns confidence 0-100.
func (p *Parser) Detect(files []parser.ExtractedFile) int {
	confidence := 0

	for _, f := range files {
		path := strings.ToLower(f.Path)

		// Strong indicators of the Inspur/Kaytus onekeylog format
		if strings.Contains(path, "onekeylog/") {
			confidence += 30
		}
		if strings.Contains(path, "devicefrusdr.log") {
			confidence += 25
		}
		if strings.Contains(path, "component/component.log") {
			confidence += 15
		}

		// Check for asset.json with the Inspur-specific structure
		if strings.HasSuffix(path, "asset.json") {
			if containsInspurMarkers(f.Content) {
				confidence += 20
			}
		}

		// Cap at 100
		if confidence >= 100 {
			return 100
		}
	}

	return confidence
}

// containsInspurMarkers checks if content has Inspur-specific markers
func containsInspurMarkers(content []byte) bool {
	s := string(content)
	// Check for the typical Inspur asset.json structure
	return strings.Contains(s, "VersionInfo") &&
		strings.Contains(s, "CpuInfo") &&
		strings.Contains(s, "MemInfo")
}

// Parse parses an Inspur/Kaytus archive
func (p *Parser) Parse(files []parser.ExtractedFile) (*models.AnalysisResult, error) {
	result := &models.AnalysisResult{
		Events:  make([]models.Event, 0),
		FRU:     make([]models.FRUInfo, 0),
		Sensors: make([]models.SensorReading, 0),
	}

	// Parse devicefrusdr.log (contains SDR and FRU data)
	if f := parser.FindFileByName(files, "devicefrusdr.log"); f != nil {
		p.parseDeviceFruSDR(f.Content, result)
	}

	// Parse asset.json
	if f := parser.FindFileByName(files, "asset.json"); f != nil {
		if hw, err := ParseAssetJSON(f.Content); err == nil {
			result.Hardware = hw
		}
	}

	// Parse component.log for additional data (PSU, etc.)
	if f := parser.FindFileByName(files, "component.log"); f != nil {
		if result.Hardware == nil {
			result.Hardware = &models.HardwareConfig{}
		}
		ParseComponentLog(f.Content, result.Hardware)

		// Extract events from component.log (memory errors, etc.)
		componentEvents := ParseComponentLogEvents(f.Content)
		result.Events = append(result.Events, componentEvents...)
	}

	// Parse IDL log (BMC alarms/diagnose events)
|
||||
if f := parser.FindFileByName(files, "idl.log"); f != nil {
|
||||
idlEvents := ParseIDLLog(f.Content)
|
||||
result.Events = append(result.Events, idlEvents...)
|
||||
}
|
||||
|
||||
// Parse syslog files
|
||||
syslogFiles := parser.FindFileByPattern(files, "syslog/alert", "syslog/warning", "syslog/notice", "syslog/info")
|
||||
for _, f := range syslogFiles {
|
||||
events := ParseSyslog(f.Content, f.Path)
|
||||
result.Events = append(result.Events, events...)
|
||||
}
|
||||
|
||||
return result, nil
|
||||
}
|
||||
|
||||
func (p *Parser) parseDeviceFruSDR(content []byte, result *models.AnalysisResult) {
|
||||
lines := string(content)
|
||||
|
||||
// Find SDR section
|
||||
sdrStart := strings.Index(lines, "BMC sdr Info:")
|
||||
fruStart := strings.Index(lines, "BMC fru Info:")
|
||||
|
||||
if sdrStart != -1 {
|
||||
var sdrContent string
|
||||
if fruStart != -1 && fruStart > sdrStart {
|
||||
sdrContent = lines[sdrStart:fruStart]
|
||||
} else {
|
||||
sdrContent = lines[sdrStart:]
|
||||
}
|
||||
result.Sensors = ParseSDR([]byte(sdrContent))
|
||||
}
|
||||
|
||||
// Find FRU section
|
||||
if fruStart != -1 {
|
||||
fruContent := lines[fruStart:]
|
||||
result.FRU = ParseFRU([]byte(fruContent))
|
||||
}
|
||||
}
|
||||
89
internal/parser/vendors/inspur/sdr.go
vendored
Normal file
@@ -0,0 +1,89 @@
package inspur

import (
    "bufio"
    "regexp"
    "strconv"
    "strings"

    "git.mchus.pro/mchus/logpile/internal/models"
)

// SDR sensor reading patterns
var (
    sdrLineRegex = regexp.MustCompile(`^(\S+)\s+\|\s+(.+?)\s+\|\s+(\w+)$`)
    valueRegex   = regexp.MustCompile(`^([\d.]+)\s+(.+)$`)
)

// ParseSDR parses BMC SDR (Sensor Data Record) output
func ParseSDR(content []byte) []models.SensorReading {
    var readings []models.SensorReading

    scanner := bufio.NewScanner(strings.NewReader(string(content)))
    for scanner.Scan() {
        line := strings.TrimSpace(scanner.Text())
        if line == "" || strings.HasPrefix(line, "BMC sdr Info:") {
            continue
        }

        matches := sdrLineRegex.FindStringSubmatch(line)
        if matches == nil {
            continue
        }

        name := strings.TrimSpace(matches[1])
        valueStr := strings.TrimSpace(matches[2])
        status := strings.TrimSpace(matches[3])

        reading := models.SensorReading{
            Name:   name,
            Status: status,
        }

        // Parse value and unit
        if valueStr != "disabled" && valueStr != "no reading" && !strings.HasPrefix(valueStr, "0x") {
            if vm := valueRegex.FindStringSubmatch(valueStr); vm != nil {
                if v, err := strconv.ParseFloat(vm[1], 64); err == nil {
                    reading.Value = v
                    reading.Unit = strings.TrimSpace(vm[2])
                }
            }
        } else if strings.HasPrefix(valueStr, "0x") {
            reading.RawValue = valueStr
        }

        // Determine sensor type
        reading.Type = determineSensorType(name)

        readings = append(readings, reading)
    }

    return readings
}

func determineSensorType(name string) string {
    nameLower := strings.ToLower(name)

    switch {
    case strings.Contains(nameLower, "temp"):
        return "temperature"
    case strings.Contains(nameLower, "fan") && strings.Contains(nameLower, "speed"):
        return "fan_speed"
    case strings.Contains(nameLower, "fan") && strings.Contains(nameLower, "status"):
        return "fan_status"
    case strings.HasSuffix(nameLower, "_vin") || strings.HasSuffix(nameLower, "_vout") ||
        strings.HasSuffix(nameLower, "_v") || strings.Contains(nameLower, "volt"):
        return "voltage"
    case strings.Contains(nameLower, "power") || strings.HasSuffix(nameLower, "_pin") ||
        strings.HasSuffix(nameLower, "_pout") || strings.HasSuffix(nameLower, "_pwr"):
        return "power"
    case strings.Contains(nameLower, "psu") && strings.Contains(nameLower, "status"):
        return "psu_status"
    case strings.Contains(nameLower, "cpu") && strings.Contains(nameLower, "status"):
        return "cpu_status"
    case strings.Contains(nameLower, "hdd") || strings.Contains(nameLower, "nvme"):
        return "storage_status"
    default:
        return "other"
    }
}
97
internal/parser/vendors/inspur/syslog.go
vendored
Normal file
@@ -0,0 +1,97 @@
package inspur

import (
    "bufio"
    "regexp"
    "strings"
    "time"

    "git.mchus.pro/mchus/logpile/internal/models"
)

var (
    // Syslog format: <priority> timestamp hostname process: message
    syslogRegex = regexp.MustCompile(`^<(\d+)>\s*(\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}[^\s]*)\s+(\S+)\s+(\S+):\s*(.*)$`)
)

// ParseSyslog parses syslog format logs
func ParseSyslog(content []byte, sourcePath string) []models.Event {
    var events []models.Event

    // Determine severity from file path
    severity := determineSeverityFromPath(sourcePath)

    scanner := bufio.NewScanner(strings.NewReader(string(content)))
    lineNum := 0

    for scanner.Scan() {
        lineNum++
        line := strings.TrimSpace(scanner.Text())
        if line == "" {
            continue
        }

        matches := syslogRegex.FindStringSubmatch(line)
        if matches == nil {
            continue
        }

        timestamp, err := time.Parse(time.RFC3339, matches[2])
        if err != nil {
            // Try alternative format
            timestamp, err = time.Parse("2006-01-02T15:04:05.000000-07:00", matches[2])
            if err != nil {
                continue
            }
        }

        event := models.Event{
            ID:          generateEventID(sourcePath, lineNum),
            Timestamp:   timestamp,
            Source:      matches[4],
            SensorType:  "syslog",
            SensorName:  matches[3],
            Description: matches[5],
            Severity:    severity,
            RawData:     line,
        }

        events = append(events, event)
    }

    return events
}

func determineSeverityFromPath(path string) models.Severity {
    pathLower := strings.ToLower(path)

    switch {
    case strings.Contains(pathLower, "emerg") || strings.Contains(pathLower, "alert") ||
        strings.Contains(pathLower, "crit"):
        return models.SeverityCritical
    case strings.Contains(pathLower, "warn") || strings.Contains(pathLower, "error"):
        return models.SeverityWarning
    default:
        return models.SeverityInfo
    }
}

func generateEventID(source string, lineNum int) string {
    parts := strings.Split(source, "/")
    filename := parts[len(parts)-1]
    return strings.TrimSuffix(filename, ".log") + "_" + itoa(lineNum)
}

func itoa(i int) string {
    if i == 0 {
        return "0"
    }
    var b [20]byte
    pos := len(b)
    for i > 0 {
        pos--
        b[pos] = byte('0' + i%10)
        i /= 10
    }
    return string(b[pos:])
}
177
internal/parser/vendors/pciids/pciids.go
vendored
Normal file
@@ -0,0 +1,177 @@
package pciids

import (
    "fmt"
    "strings"
)

// VendorName returns vendor name by PCI Vendor ID
func VendorName(vendorID int) string {
    if name, ok := vendors[vendorID]; ok {
        return name
    }
    return ""
}

// DeviceName returns device name by Vendor ID and Device ID
func DeviceName(vendorID, deviceID int) string {
    key := fmt.Sprintf("%04x:%04x", vendorID, deviceID)
    if name, ok := devices[key]; ok {
        return name
    }
    return ""
}

// DeviceInfo returns both vendor and device name
func DeviceInfo(vendorID, deviceID int) (vendor, device string) {
    vendor = VendorName(vendorID)
    device = DeviceName(vendorID, deviceID)
    return
}

// VendorNameFromString tries to parse vendor ID from string (hex) and return name
func VendorNameFromString(s string) string {
    s = strings.TrimSpace(s)
    if s == "" {
        return ""
    }

    // Try to parse as hex (with or without 0x prefix)
    s = strings.TrimPrefix(strings.ToLower(s), "0x")

    var id int
    for _, c := range s {
        if c >= '0' && c <= '9' {
            id = id*16 + int(c-'0')
        } else if c >= 'a' && c <= 'f' {
            id = id*16 + int(c-'a'+10)
        } else {
            // Not a valid hex string; give up and return an empty name
            return ""
        }
    }

    return VendorName(id)
}

// Common PCI Vendor IDs
// Source: https://pci-ids.ucw.cz/
var vendors = map[int]string{
    // Storage controllers and SSDs
    0x1E0F: "KIOXIA",
    0x144D: "Samsung Electronics",
    0x1C5C: "SK Hynix",
    0x15B7: "SanDisk (Western Digital)",
    0x1179: "Toshiba",
    0x8086: "Intel",
    0x1344: "Micron Technology",
    0x126F: "Silicon Motion",
    0x1987: "Phison Electronics",
    0x1CC1: "ADATA Technology",
    0x2646: "Kingston Technology",
    0x1E95: "Solid State Storage Technology",
    0x025E: "Solidigm",
    0x1D97: "Shenzhen Longsys Electronics",
    0x1E4B: "MAXIO Technology",

    // Network adapters
    0x15B3: "Mellanox Technologies",
    0x14E4: "Broadcom",
    0x10EC: "Realtek Semiconductor",
    0x1077: "QLogic",
    0x19A2: "Emulex",
    0x1137: "Cisco Systems",
    0x1924: "Solarflare Communications",
    0x177D: "Cavium",
    0x1D6A: "Aquantia",
    0x1FC9: "Tehuti Networks",
    0x18D4: "Chelsio Communications",

    // GPU / Graphics
    0x10DE: "NVIDIA",
    0x1002: "AMD/ATI",
    0x102B: "Matrox Electronics",
    0x1A03: "ASPEED Technology",

    // Storage controllers (RAID/HBA)
    0x1000: "LSI Logic / Broadcom",
    0x9005: "Adaptec / Microsemi",
    0x1028: "Dell",
    0x103C: "Hewlett-Packard",
    0x17D3: "Areca Technology",
    0x1CC4: "Union Memory",

    // Server vendors
    0x1014: "IBM",
    0x15D9: "Supermicro",
    0x8088: "Inspur",

    // Other common
    0x1022: "AMD",
    0x1106: "VIA Technologies",
    0x10B5: "PLX Technology",
    0x1B21: "ASMedia Technology",
    0x1B4B: "Marvell Technology",
    0x197B: "JMicron Technology",
}

// Device IDs (vendor:device -> name)
var devices = map[string]string{
    // NVIDIA GPUs (0x10DE)
    "10de:26b9": "L40S 48GB",
    "10de:26b1": "L40 48GB",
    "10de:2684": "RTX 4090",
    "10de:2704": "RTX 4080",
    "10de:2782": "RTX 4070 Ti",
    "10de:2786": "RTX 4070",
    "10de:27b8": "RTX 4060 Ti",
    "10de:2882": "RTX 4060",
    "10de:2204": "RTX 3090",
    "10de:2208": "RTX 3080 Ti",
    "10de:2206": "RTX 3080",
    "10de:2484": "RTX 3070",
    "10de:2503": "RTX 3060",
    "10de:20b0": "A100 80GB",
    "10de:20b2": "A100 40GB",
    "10de:20f1": "A10",
    "10de:2236": "A10G",
    "10de:25b6": "A16",
    "10de:20b5": "A30",
    "10de:20b7": "A30X",
    "10de:1db4": "V100 32GB",
    "10de:1db1": "V100 16GB",
    "10de:1e04": "RTX 2080 Ti",
    "10de:1e07": "RTX 2080",
    "10de:1f02": "RTX 2070",
    "10de:26ba": "L40S-PCIE-48G",
    "10de:2330": "H100 80GB PCIe",
    "10de:2331": "H100 80GB SXM5",
    "10de:2322": "H100 NVL",
    "10de:2324": "H200",

    // AMD GPUs (0x1002)
    "1002:744c": "Instinct MI250X",
    "1002:7408": "Instinct MI100",
    "1002:73a5": "RX 6950 XT",
    "1002:73bf": "RX 6900 XT",
    "1002:73df": "RX 6700 XT",
    "1002:7480": "RX 7900 XTX",
    "1002:7483": "RX 7900 XT",

    // ASPEED (0x1A03) - BMC VGA
    "1a03:2000": "AST2500 VGA",
    "1a03:1150": "AST2600 VGA",

    // Intel GPUs
    "8086:56c0": "Data Center GPU Flex 170",
    "8086:56c1": "Data Center GPU Flex 140",

    // Mellanox/NVIDIA NICs (0x15B3)
    "15b3:1017": "ConnectX-5 100GbE",
    "15b3:1019": "ConnectX-5 Ex",
    "15b3:101b": "ConnectX-6",
    "15b3:101d": "ConnectX-6 Dx",
    "15b3:101f": "ConnectX-6 Lx",
    "15b3:1021": "ConnectX-7",
    "15b3:a2d6": "ConnectX-4 Lx",
}
70
internal/parser/vendors/template/parser.go.template
vendored
Normal file
@@ -0,0 +1,70 @@
// Package VENDORNAME provides parser for VENDOR_DESCRIPTION BMC diagnostic archives
// Copy this template to create a new vendor parser module
package VENDORNAME

import (
    "strings"

    "git.mchus.pro/mchus/logpile/internal/models"
    "git.mchus.pro/mchus/logpile/internal/parser"
)

func init() {
    parser.Register(&Parser{})
}

// Parser implements VendorParser for VENDOR_DESCRIPTION servers
type Parser struct{}

// Name returns human-readable parser name
func (p *Parser) Name() string {
    return "VENDOR_DESCRIPTION BMC Parser"
}

// Vendor returns vendor identifier
func (p *Parser) Vendor() string {
    return "VENDORNAME"
}

// Detect checks if archive matches this vendor's format
// Returns confidence 0-100
func (p *Parser) Detect(files []parser.ExtractedFile) int {
    confidence := 0

    for _, f := range files {
        path := strings.ToLower(f.Path)

        // Add detection logic here
        // Example:
        // if strings.Contains(path, "unique_vendor_file.log") {
        //     confidence += 50
        // }
        _ = path
    }

    // Cap at 100
    if confidence > 100 {
        return 100
    }

    return confidence
}

// Parse parses the archive using vendor-specific logic
func (p *Parser) Parse(files []parser.ExtractedFile) (*models.AnalysisResult, error) {
    result := &models.AnalysisResult{
        Events:  make([]models.Event, 0),
        FRU:     make([]models.FRUInfo, 0),
        Sensors: make([]models.SensorReading, 0),
    }

    // Add parsing logic here
    // Example:
    // if f := parser.FindFileByName(files, "sensor_data.log"); f != nil {
    //     result.Sensors = parseSensorLog(f.Content)
    // }

    return result, nil
}

// Add helper functions for parsing specific file formats below
14
internal/parser/vendors/vendors.go
vendored
Normal file
@@ -0,0 +1,14 @@
// Package vendors imports all vendor parser modules
// Add new vendor imports here to register them
package vendors

import (
    // Import vendor modules to trigger their init() registration
    _ "git.mchus.pro/mchus/logpile/internal/parser/vendors/inspur"

    // Future vendors:
    // _ "git.mchus.pro/mchus/logpile/internal/parser/vendors/supermicro"
    // _ "git.mchus.pro/mchus/logpile/internal/parser/vendors/dell"
    // _ "git.mchus.pro/mchus/logpile/internal/parser/vendors/hpe"
    // _ "git.mchus.pro/mchus/logpile/internal/parser/vendors/lenovo"
)
461
internal/server/handlers.go
Normal file
@@ -0,0 +1,461 @@
package server

import (
    "encoding/json"
    "fmt"
    "html/template"
    "net/http"
    "strings"

    "git.mchus.pro/mchus/logpile/internal/exporter"
    "git.mchus.pro/mchus/logpile/internal/models"
    "git.mchus.pro/mchus/logpile/internal/parser"
)

func (s *Server) handleIndex(w http.ResponseWriter, r *http.Request) {
    if r.URL.Path != "/" {
        http.NotFound(w, r)
        return
    }

    tmplContent, err := WebFS.ReadFile("templates/index.html")
    if err != nil {
        http.Error(w, "Template not found", http.StatusInternalServerError)
        return
    }

    tmpl, err := template.New("index").Parse(string(tmplContent))
    if err != nil {
        http.Error(w, "Template parse error", http.StatusInternalServerError)
        return
    }

    w.Header().Set("Content-Type", "text/html; charset=utf-8")
    tmpl.Execute(w, nil)
}

func (s *Server) handleUpload(w http.ResponseWriter, r *http.Request) {
    // Max 100MB file
    if err := r.ParseMultipartForm(100 << 20); err != nil {
        jsonError(w, "File too large", http.StatusBadRequest)
        return
    }

    file, header, err := r.FormFile("archive")
    if err != nil {
        jsonError(w, "Failed to read file", http.StatusBadRequest)
        return
    }
    defer file.Close()

    // Parse archive
    p := parser.NewBMCParser()
    if err := p.ParseFromReader(file, header.Filename); err != nil {
        jsonError(w, "Failed to parse archive: "+err.Error(), http.StatusBadRequest)
        return
    }

    result := p.Result()
    s.SetResult(result)
    s.SetDetectedVendor(p.DetectedVendor())

    jsonResponse(w, map[string]interface{}{
        "status":   "ok",
        "message":  "File uploaded and parsed successfully",
        "filename": header.Filename,
        "vendor":   p.DetectedVendor(),
        "stats": map[string]int{
            "events":  len(result.Events),
            "sensors": len(result.Sensors),
            "fru":     len(result.FRU),
        },
    })
}

func (s *Server) handleGetParsers(w http.ResponseWriter, r *http.Request) {
    jsonResponse(w, map[string]interface{}{
        "parsers": parser.ListParsers(),
    })
}

func (s *Server) handleGetEvents(w http.ResponseWriter, r *http.Request) {
    result := s.GetResult()
    if result == nil {
        jsonResponse(w, []interface{}{})
        return
    }
    jsonResponse(w, result.Events)
}

func (s *Server) handleGetSensors(w http.ResponseWriter, r *http.Request) {
    result := s.GetResult()
    if result == nil {
        jsonResponse(w, []interface{}{})
        return
    }
    jsonResponse(w, result.Sensors)
}

func (s *Server) handleGetConfig(w http.ResponseWriter, r *http.Request) {
    result := s.GetResult()
    if result == nil || result.Hardware == nil {
        jsonResponse(w, map[string]interface{}{})
        return
    }

    // Build specification summary
    spec := buildSpecification(result)

    jsonResponse(w, map[string]interface{}{
        "hardware":      result.Hardware,
        "specification": spec,
    })
}
// SpecLine represents a single line in the specification
type SpecLine struct {
    Category string `json:"category"`
    Name     string `json:"name"`
    Quantity int    `json:"quantity"`
}

func buildSpecification(result *models.AnalysisResult) []SpecLine {
    var spec []SpecLine
    hw := result.Hardware
    if hw == nil {
        return spec
    }

    // CPUs - group by model
    cpuGroups := make(map[string]int)
    cpuDetails := make(map[string]models.CPU)
    for _, cpu := range hw.CPUs {
        cpuGroups[cpu.Model]++
        cpuDetails[cpu.Model] = cpu
    }
    for model, count := range cpuGroups {
        cpu := cpuDetails[model]
        name := fmt.Sprintf("Intel %s (%.1fGHz %dC %dW)",
            model,
            float64(cpu.FrequencyMHz)/1000,
            cpu.Cores,
            cpu.TDP)
        spec = append(spec, SpecLine{Category: "Процессор", Name: name, Quantity: count})
    }

    // Memory - group by size and type
    memGroups := make(map[string]int)
    for _, mem := range hw.Memory {
        key := fmt.Sprintf("%s %dGB", mem.Type, mem.SizeMB/1024)
        memGroups[key]++
    }
    for key, count := range memGroups {
        spec = append(spec, SpecLine{Category: "Память", Name: key, Quantity: count})
    }

    // Storage - group by type and capacity
    storGroups := make(map[string]int)
    for _, stor := range hw.Storage {
        var key string
        if stor.SizeGB >= 1000 {
            key = fmt.Sprintf("%s %s %.2fTB", stor.Type, stor.Interface, float64(stor.SizeGB)/1000)
        } else {
            key = fmt.Sprintf("%s %s %dGB", stor.Type, stor.Interface, stor.SizeGB)
        }
        storGroups[key]++
    }
    for key, count := range storGroups {
        spec = append(spec, SpecLine{Category: "Накопитель", Name: key, Quantity: count})
    }

    // PCIe devices - group by device class/name and manufacturer
    pcieGroups := make(map[string]int)
    pcieDetails := make(map[string]models.PCIeDevice)
    for _, pcie := range hw.PCIeDevices {
        // Create unique key from manufacturer + device class/name
        key := pcie.DeviceClass
        if pcie.Manufacturer != "" {
            key = pcie.Manufacturer + " " + pcie.DeviceClass
        }
        if pcie.PartNumber != "" && pcie.PartNumber != pcie.DeviceClass {
            key = key + " (" + pcie.PartNumber + ")"
        }
        pcieGroups[key]++
        pcieDetails[key] = pcie
    }
    for key, count := range pcieGroups {
        pcie := pcieDetails[key]
        category := "PCIe устройство"
        name := key

        // Determine category based on device class or known GPU names
        deviceClass := pcie.DeviceClass
        isGPU := isGPUDevice(deviceClass)
        isNetwork := deviceClass == "Network" || strings.Contains(deviceClass, "ConnectX")

        if isGPU {
            category = "Графический процессор"
        } else if isNetwork {
            category = "Сетевой адаптер"
        } else if deviceClass == "NVMe" || deviceClass == "RAID" || deviceClass == "SAS" || deviceClass == "SATA" || deviceClass == "Storage" {
            category = "Контроллер"
        }

        spec = append(spec, SpecLine{Category: category, Name: name, Quantity: count})
    }

    // Power supplies - group by model/wattage
    psuGroups := make(map[string]int)
    for _, psu := range hw.PowerSupply {
        key := psu.Model
        if key == "" && psu.WattageW > 0 {
            key = fmt.Sprintf("%dW", psu.WattageW)
        }
        if key != "" {
            psuGroups[key]++
        }
    }
    for key, count := range psuGroups {
        spec = append(spec, SpecLine{Category: "Блок питания", Name: key, Quantity: count})
    }

    return spec
}
func (s *Server) handleGetSerials(w http.ResponseWriter, r *http.Request) {
    result := s.GetResult()
    if result == nil {
        jsonResponse(w, []interface{}{})
        return
    }

    // Collect all serial numbers from various sources
    type SerialEntry struct {
        Component    string `json:"component"`
        SerialNumber string `json:"serial_number"`
        Manufacturer string `json:"manufacturer,omitempty"`
        PartNumber   string `json:"part_number,omitempty"`
        Category     string `json:"category"`
    }

    var serials []SerialEntry

    // From FRU
    for _, fru := range result.FRU {
        if fru.SerialNumber == "" {
            continue
        }
        name := fru.ProductName
        if name == "" {
            name = fru.Description
        }
        serials = append(serials, SerialEntry{
            Component:    name,
            SerialNumber: fru.SerialNumber,
            Manufacturer: fru.Manufacturer,
            PartNumber:   fru.PartNumber,
            Category:     "FRU",
        })
    }

    // From Hardware
    if result.Hardware != nil {
        // Board
        if result.Hardware.BoardInfo.SerialNumber != "" {
            serials = append(serials, SerialEntry{
                Component:    result.Hardware.BoardInfo.ProductName,
                SerialNumber: result.Hardware.BoardInfo.SerialNumber,
                Manufacturer: result.Hardware.BoardInfo.Manufacturer,
                PartNumber:   result.Hardware.BoardInfo.PartNumber,
                Category:     "Board",
            })
        }

        // CPUs
        for _, cpu := range result.Hardware.CPUs {
            sn := cpu.SerialNumber
            if sn == "" {
                sn = cpu.PPIN // Use PPIN as fallback identifier
            }
            if sn == "" {
                continue
            }
            serials = append(serials, SerialEntry{
                Component:    cpu.Model,
                SerialNumber: sn,
                Category:     "CPU",
            })
        }

        // Memory DIMMs
        for _, mem := range result.Hardware.Memory {
            if mem.SerialNumber == "" {
                continue
            }
            serials = append(serials, SerialEntry{
                Component:    mem.PartNumber,
                SerialNumber: mem.SerialNumber,
                Manufacturer: mem.Manufacturer,
                PartNumber:   mem.PartNumber,
                Category:     "Memory",
            })
        }

        // Storage
        for _, stor := range result.Hardware.Storage {
            if stor.SerialNumber == "" {
                continue
            }
            serials = append(serials, SerialEntry{
                Component:    stor.Model,
                SerialNumber: stor.SerialNumber,
                Manufacturer: stor.Manufacturer,
                PartNumber:   stor.Slot,
                Category:     "Storage",
            })
        }

        // PCIe devices
        for _, pcie := range result.Hardware.PCIeDevices {
            if pcie.SerialNumber == "" {
                continue
            }
            serials = append(serials, SerialEntry{
                Component:    pcie.DeviceClass + " (" + pcie.Slot + ")",
                SerialNumber: pcie.SerialNumber,
                Manufacturer: pcie.Manufacturer,
                PartNumber:   pcie.PartNumber,
                Category:     "PCIe",
            })
        }

        // Network cards
        for _, nic := range result.Hardware.NetworkCards {
            if nic.SerialNumber == "" {
                continue
            }
            serials = append(serials, SerialEntry{
                Component:    nic.Model,
                SerialNumber: nic.SerialNumber,
                Category:     "Network",
            })
        }

        // Power supplies
        for _, psu := range result.Hardware.PowerSupply {
            if psu.SerialNumber == "" {
                continue
            }
            serials = append(serials, SerialEntry{
                Component:    psu.Model,
                SerialNumber: psu.SerialNumber,
                PartNumber:   psu.Slot,
                Category:     "PSU",
            })
        }
    }

    jsonResponse(w, serials)
}

func (s *Server) handleGetFirmware(w http.ResponseWriter, r *http.Request) {
    result := s.GetResult()
    if result == nil || result.Hardware == nil {
        jsonResponse(w, []interface{}{})
        return
    }
    jsonResponse(w, result.Hardware.Firmware)
}

func (s *Server) handleGetStatus(w http.ResponseWriter, r *http.Request) {
    result := s.GetResult()
    if result == nil {
        jsonResponse(w, map[string]interface{}{
            "loaded": false,
        })
        return
    }

    jsonResponse(w, map[string]interface{}{
        "loaded":   true,
        "filename": result.Filename,
        "vendor":   s.GetDetectedVendor(),
        "stats": map[string]int{
            "events":  len(result.Events),
            "sensors": len(result.Sensors),
            "fru":     len(result.FRU),
        },
    })
}

func (s *Server) handleExportCSV(w http.ResponseWriter, r *http.Request) {
    result := s.GetResult()

    w.Header().Set("Content-Type", "text/csv; charset=utf-8")
    w.Header().Set("Content-Disposition", "attachment; filename=serials.csv")

    exp := exporter.New(result)
    exp.ExportCSV(w)
}

func (s *Server) handleExportJSON(w http.ResponseWriter, r *http.Request) {
    result := s.GetResult()

    w.Header().Set("Content-Type", "application/json")
    w.Header().Set("Content-Disposition", "attachment; filename=report.json")

    exp := exporter.New(result)
    exp.ExportJSON(w)
}

func (s *Server) handleExportTXT(w http.ResponseWriter, r *http.Request) {
    result := s.GetResult()

    w.Header().Set("Content-Type", "text/plain; charset=utf-8")
    w.Header().Set("Content-Disposition", "attachment; filename=report.txt")

    exp := exporter.New(result)
    exp.ExportTXT(w)
}

func (s *Server) handleClear(w http.ResponseWriter, r *http.Request) {
    s.SetResult(nil)
    s.SetDetectedVendor("")
    jsonResponse(w, map[string]string{
        "status":  "ok",
        "message": "Data cleared",
    })
}

func jsonResponse(w http.ResponseWriter, data interface{}) {
    w.Header().Set("Content-Type", "application/json")
    json.NewEncoder(w).Encode(data)
}

func jsonError(w http.ResponseWriter, message string, code int) {
    w.Header().Set("Content-Type", "application/json")
    w.WriteHeader(code)
    json.NewEncoder(w).Encode(map[string]string{"error": message})
}

// isGPUDevice checks if device class indicates a GPU
func isGPUDevice(deviceClass string) bool {
    // Standard PCI class names
    if deviceClass == "VGA" || deviceClass == "3D Controller" || deviceClass == "Display" {
        return true
    }
    // Known GPU model patterns
    gpuPatterns := []string{
        "L40", "A100", "A10", "A16", "A30", "H100", "H200", "V100",
        "RTX", "GTX", "Quadro", "Tesla",
        "Instinct", "Radeon",
        "AST2500", "AST2600", // ASPEED BMC VGA
    }
    upperClass := strings.ToUpper(deviceClass)
    for _, pattern := range gpuPatterns {
        if strings.Contains(upperClass, strings.ToUpper(pattern)) {
            return true
        }
    }
    return false
}
96
internal/server/server.go
Normal file
@@ -0,0 +1,96 @@
package server

import (
    "embed"
    "fmt"
    "io/fs"
    "net/http"
    "sync"

    "git.mchus.pro/mchus/logpile/internal/models"
)

// WebFS holds embedded web files (set by main package)
var WebFS embed.FS

type Config struct {
    Port        int
    PreloadFile string
}

type Server struct {
    config Config
    mux    *http.ServeMux

    mu             sync.RWMutex
    result         *models.AnalysisResult
    detectedVendor string
}

func New(cfg Config) *Server {
    s := &Server{
        config: cfg,
        mux:    http.NewServeMux(),
    }
    s.setupRoutes()
    return s
}

func (s *Server) setupRoutes() {
    // Static files
    staticContent, err := fs.Sub(WebFS, "static")
    if err != nil {
        panic(err)
    }
    s.mux.Handle("/static/", http.StripPrefix("/static/", http.FileServer(http.FS(staticContent))))

    // Pages
    s.mux.HandleFunc("/", s.handleIndex)

    // API endpoints
    s.mux.HandleFunc("POST /api/upload", s.handleUpload)
    s.mux.HandleFunc("GET /api/status", s.handleGetStatus)
    s.mux.HandleFunc("GET /api/parsers", s.handleGetParsers)
    s.mux.HandleFunc("GET /api/events", s.handleGetEvents)
    s.mux.HandleFunc("GET /api/sensors", s.handleGetSensors)
    s.mux.HandleFunc("GET /api/config", s.handleGetConfig)
    s.mux.HandleFunc("GET /api/serials", s.handleGetSerials)
    s.mux.HandleFunc("GET /api/firmware", s.handleGetFirmware)
    s.mux.HandleFunc("GET /api/export/csv", s.handleExportCSV)
    s.mux.HandleFunc("GET /api/export/json", s.handleExportJSON)
    s.mux.HandleFunc("GET /api/export/txt", s.handleExportTXT)
    s.mux.HandleFunc("DELETE /api/clear", s.handleClear)
}

func (s *Server) Run() error {
    addr := fmt.Sprintf(":%d", s.config.Port)
    return http.ListenAndServe(addr, s.mux)
}

// SetResult sets the analysis result (thread-safe)
func (s *Server) SetResult(result *models.AnalysisResult) {
    s.mu.Lock()
    defer s.mu.Unlock()
    s.result = result
}

// GetResult returns the analysis result (thread-safe)
func (s *Server) GetResult() *models.AnalysisResult {
    s.mu.RLock()
    defer s.mu.RUnlock()
    return s.result
}

// SetDetectedVendor sets the detected vendor name (thread-safe)
func (s *Server) SetDetectedVendor(vendor string) {
    s.mu.Lock()
    defer s.mu.Unlock()
    s.detectedVendor = vendor
}

// GetDetectedVendor returns the detected vendor name (thread-safe)
func (s *Server) GetDetectedVendor() string {
    s.mu.RLock()
    defer s.mu.RUnlock()
    return s.detectedVendor
}
6
web/embed.go
Normal file
@@ -0,0 +1,6 @@
package web

import "embed"

//go:embed templates static
var FS embed.FS
405
web/static/css/style.css
Normal file
@@ -0,0 +1,405 @@
* {
    box-sizing: border-box;
    margin: 0;
    padding: 0;
}

body {
    font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, sans-serif;
    background: #f5f5f5;
    color: #333;
    line-height: 1.6;
}

header {
    background: #2c3e50;
    color: white;
    padding: 1rem 2rem;
}

header h1 {
    font-size: 1.5rem;
    font-weight: 600;
}

header p {
    font-size: 0.875rem;
    opacity: 0.8;
}

main {
    max-width: 1400px;
    margin: 2rem auto;
    padding: 0 1rem;
}

/* Upload section */
.upload-area {
    border: 2px dashed #ccc;
    border-radius: 8px;
    padding: 3rem;
    text-align: center;
    background: white;
    transition: border-color 0.3s, background 0.3s;
}

.upload-area.dragover {
    border-color: #3498db;
    background: #ebf5fb;
}

.upload-area button {
    background: #3498db;
    color: white;
    border: none;
    padding: 0.75rem 1.5rem;
    border-radius: 4px;
    cursor: pointer;
    font-size: 1rem;
    margin: 1rem 0;
}

.upload-area button:hover {
    background: #2980b9;
}

.upload-area .hint {
    font-size: 0.8rem;
    color: #888;
}

#upload-status {
    margin-top: 1rem;
    text-align: center;
    padding: 0.5rem;
}

#upload-status.error {
    color: #e74c3c;
}

#upload-status.success {
    color: #27ae60;
}

/* Tabs */
.tabs {
    display: flex;
    border-bottom: 2px solid #ddd;
    margin-bottom: 1rem;
    flex-wrap: wrap;
}

.tab {
    padding: 0.75rem 1.5rem;
    background: none;
    border: none;
    cursor: pointer;
    font-size: 1rem;
    color: #666;
    border-bottom: 2px solid transparent;
    margin-bottom: -2px;
}

.tab:hover {
    color: #333;
}

.tab.active {
    color: #3498db;
    border-bottom-color: #3498db;
}

.tab-content {
    display: none;
    background: white;
    border-radius: 8px;
    padding: 1rem;
}

.tab-content.active {
    display: block;
}

/* Toolbar */
.toolbar {
    display: flex;
    gap: 1rem;
    margin-bottom: 1rem;
    align-items: center;
    flex-wrap: wrap;
}

.toolbar select,
.toolbar button {
    padding: 0.5rem 1rem;
    border-radius: 4px;
    font-size: 0.875rem;
}

.toolbar select {
    border: 1px solid #ddd;
}

.toolbar button {
    background: #27ae60;
    color: white;
    border: none;
    cursor: pointer;
}

.toolbar button:hover {
    background: #219a52;
}

/* Tables */
table {
    width: 100%;
    border-collapse: collapse;
}

th, td {
    padding: 0.75rem;
    text-align: left;
    border-bottom: 1px solid #eee;
}

th {
    background: #f8f9fa;
    font-weight: 600;
}

tr:hover {
    background: #f8f9fa;
}

code {
    font-family: 'Monaco', 'Menlo', monospace;
    font-size: 0.85em;
    background: #f0f0f0;
    padding: 0.1em 0.3em;
    border-radius: 3px;
}

/* Severity badges */
.severity {
    padding: 0.25rem 0.5rem;
    border-radius: 4px;
    font-size: 0.75rem;
    font-weight: 600;
    text-transform: uppercase;
}

.severity.critical {
    background: #e74c3c;
    color: white;
}

.severity.warning {
    background: #f39c12;
    color: white;
}

.severity.info {
    background: #3498db;
    color: white;
}

/* Category badges */
.category-badge {
    display: inline-block;
    padding: 0.2rem 0.5rem;
    border-radius: 4px;
    font-size: 0.75rem;
    font-weight: 500;
    background: #95a5a6;
    color: white;
}

.category-badge.board {
    background: #2c3e50;
}

.category-badge.cpu {
    background: #e74c3c;
}

.category-badge.memory {
    background: #3498db;
}

.category-badge.storage {
    background: #27ae60;
}

.category-badge.pcie {
    background: #e67e22;
}

.category-badge.network {
    background: #1abc9c;
}

.category-badge.psu {
    background: #f39c12;
}

.category-badge.fru {
    background: #9b59b6;
}

/* Toolbar label */
.toolbar-label {
    font-size: 0.875rem;
    color: #666;
}

/* Config sections */
.config-section {
    margin-bottom: 1.5rem;
}

.config-section h3 {
    font-size: 1rem;
    color: #2c3e50;
    margin-bottom: 0.5rem;
    border-bottom: 1px solid #eee;
    padding-bottom: 0.25rem;
}

/* Specification section */
.spec-section {
    background: #f8f9fa;
    border-radius: 8px;
    padding: 1rem;
    border-left: 4px solid #3498db;
}

.spec-list {
    list-style: none;
    margin: 0;
    padding: 0;
}

.spec-list li {
    padding: 0.4rem 0;
    border-bottom: 1px dashed #e0e0e0;
}

.spec-list li:last-child {
    border-bottom: none;
}

.spec-category {
    font-weight: 600;
    color: #2c3e50;
    min-width: 180px;
    display: inline-block;
}

.spec-qty {
    color: #666;
    font-weight: 500;
}

.card {
    background: #f8f9fa;
    padding: 1rem;
    border-radius: 4px;
    margin-bottom: 0.5rem;
}

/* Sensors */
.sensor-group {
    margin-bottom: 1.5rem;
}

.sensor-group h3 {
    font-size: 1rem;
    color: #2c3e50;
    margin-bottom: 0.75rem;
}

.sensor-grid {
    display: grid;
    grid-template-columns: repeat(auto-fill, minmax(180px, 1fr));
    gap: 0.5rem;
}

.sensor-card {
    display: flex;
    flex-direction: column;
    padding: 0.5rem 0.75rem;
    border-radius: 4px;
    background: #f8f9fa;
    border-left: 3px solid #27ae60;
}

.sensor-card.ok {
    border-left-color: #27ae60;
}

.sensor-card.warn {
    border-left-color: #f39c12;
}

.sensor-card.ns {
    border-left-color: #95a5a6;
    opacity: 0.6;
}

.sensor-name {
    font-size: 0.75rem;
    color: #666;
}

.sensor-value {
    font-size: 1rem;
    font-weight: 600;
    color: #2c3e50;
}

/* Footer */
footer {
    text-align: center;
    padding: 2rem;
}

#clear-btn {
    background: #e74c3c;
    color: white;
    border: none;
    padding: 0.5rem 1rem;
    border-radius: 4px;
    cursor: pointer;
}

#clear-btn:hover {
    background: #c0392b;
}

/* Utility */
.hidden {
    display: none !important;
}

.no-data {
    text-align: center;
    color: #888;
    padding: 2rem;
}

/* Responsive */
@media (max-width: 768px) {
    .sensor-grid {
        grid-template-columns: repeat(2, 1fr);
    }

    table {
        font-size: 0.875rem;
    }

    th, td {
        padding: 0.5rem;
    }
}
443
web/static/js/app.js
Normal file
@@ -0,0 +1,443 @@
// LOGPile Frontend Application

document.addEventListener('DOMContentLoaded', () => {
    initUpload();
    initTabs();
    initFilters();
});

// Upload handling
function initUpload() {
    const dropZone = document.getElementById('drop-zone');
    const fileInput = document.getElementById('file-input');

    dropZone.addEventListener('dragover', (e) => {
        e.preventDefault();
        dropZone.classList.add('dragover');
    });

    dropZone.addEventListener('dragleave', () => {
        dropZone.classList.remove('dragover');
    });

    dropZone.addEventListener('drop', (e) => {
        e.preventDefault();
        dropZone.classList.remove('dragover');
        const files = e.dataTransfer.files;
        if (files.length > 0) {
            uploadFile(files[0]);
        }
    });

    fileInput.addEventListener('change', () => {
        if (fileInput.files.length > 0) {
            uploadFile(fileInput.files[0]);
        }
    });
}

async function uploadFile(file) {
    const status = document.getElementById('upload-status');
    status.textContent = 'Загрузка и анализ...';
    status.className = '';

    const formData = new FormData();
    formData.append('archive', file);

    try {
        const response = await fetch('/api/upload', {
            method: 'POST',
            body: formData
        });

        const result = await response.json();

        if (response.ok) {
            status.innerHTML = `<strong>${escapeHtml(result.vendor)}</strong><br>` +
                `${result.stats.sensors} сенсоров, ${result.stats.fru} компонентов, ${result.stats.events} событий`;
            status.className = 'success';
            loadData(result.vendor);
        } else {
            status.textContent = result.error || 'Ошибка загрузки';
            status.className = 'error';
        }
    } catch (err) {
        status.textContent = 'Ошибка соединения';
        status.className = 'error';
    }
}

// Tab navigation
function initTabs() {
    const tabs = document.querySelectorAll('.tab');
    tabs.forEach(tab => {
        tab.addEventListener('click', () => {
            tabs.forEach(t => t.classList.remove('active'));
            document.querySelectorAll('.tab-content').forEach(c => c.classList.remove('active'));
            tab.classList.add('active');
            document.getElementById(tab.dataset.tab).classList.add('active');
        });
    });
}

// Filters
function initFilters() {
    document.getElementById('sensor-filter').addEventListener('change', (e) => {
        filterSensors(e.target.value);
    });
    document.getElementById('severity-filter').addEventListener('change', (e) => {
        filterEvents(e.target.value);
    });
    document.getElementById('serial-filter').addEventListener('change', (e) => {
        filterSerials(e.target.value);
    });
}

let allSensors = [];
let allEvents = [];
let allSerials = [];

let currentVendor = '';

// Load data from API
async function loadData(vendor) {
    currentVendor = vendor || '';
    document.getElementById('upload-section').classList.add('hidden');
    document.getElementById('data-section').classList.remove('hidden');
    document.getElementById('clear-btn').classList.remove('hidden');

    // Update vendor badge if exists
    const vendorBadge = document.getElementById('vendor-badge');
    if (vendorBadge && currentVendor) {
        vendorBadge.textContent = currentVendor;
        vendorBadge.classList.remove('hidden');
    }

    await Promise.all([
        loadConfig(),
        loadFirmware(),
        loadSensors(),
        loadSerials(),
        loadEvents()
    ]);
}

async function loadConfig() {
    try {
        const response = await fetch('/api/config');
        const config = await response.json();
        renderConfig(config);
    } catch (err) {
        console.error('Failed to load config:', err);
    }
}

function renderConfig(data) {
    const container = document.getElementById('config-content');

    if (!data || Object.keys(data).length === 0) {
        container.innerHTML = '<p class="no-data">Нет данных о конфигурации</p>';
        return;
    }

    // Handle both old format (direct hardware) and new format (with specification)
    const config = data.hardware || data;
    const spec = data.specification;

    let html = '';

    // Specification summary (new section)
    if (spec && spec.length > 0) {
        html += '<div class="config-section spec-section"><h3>Спецификация сервера</h3><ul class="spec-list">';
        spec.forEach(item => {
            html += `<li><span class="spec-category">${escapeHtml(item.category)}</span> ${escapeHtml(item.name)} <span class="spec-qty">- ${item.quantity} шт.</span></li>`;
        });
        html += '</ul></div>';
    }

    // CPUs
    if (config.cpus && config.cpus.length > 0) {
        html += '<div class="config-section"><h3>Процессоры</h3>';
        config.cpus.forEach(cpu => {
            html += `<div class="card">
                <strong>Socket ${cpu.socket}: ${escapeHtml(cpu.model)}</strong><br>
                Ядра: ${cpu.cores}, Потоки: ${cpu.threads}<br>
                Частота: ${cpu.frequency_mhz} MHz (Turbo: ${cpu.max_frequency_mhz} MHz)<br>
                TDP: ${cpu.tdp_w}W, L3: ${Math.round(cpu.l3_cache_kb/1024)} MB
            </div>`;
        });
        html += '</div>';
    }

    // Memory summary
    if (config.memory && config.memory.length > 0) {
        const totalGB = config.memory.reduce((sum, m) => sum + m.size_mb, 0) / 1024;
        html += `<div class="config-section"><h3>Память</h3>
            <p>Всего: ${totalGB} GB (${config.memory.length} модулей ${config.memory[0].type} @ ${config.memory[0].speed_mhz} MHz)</p>
            <p>Производитель: ${escapeHtml(config.memory[0].manufacturer)}</p>
        </div>`;
    }

    // Storage summary
    if (config.storage && config.storage.length > 0) {
        html += '<div class="config-section"><h3>Накопители</h3><table><thead><tr><th>Слот</th><th>Модель</th><th>Размер</th></tr></thead><tbody>';
        config.storage.forEach(s => {
            html += `<tr>
                <td>${escapeHtml(s.slot || s.interface)}</td>
                <td>${escapeHtml(s.model)}</td>
                <td>${s.size_gb} GB</td>
            </tr>`;
        });
        html += '</tbody></table></div>';
    }

    // PCIe summary
    if (config.pcie_devices && config.pcie_devices.length > 0) {
        html += '<div class="config-section"><h3>PCIe устройства</h3><table><thead><tr><th>Слот</th><th>Тип</th><th>Производитель</th><th>Скорость</th></tr></thead><tbody>';
        config.pcie_devices.forEach(p => {
            html += `<tr>
                <td>${escapeHtml(p.slot)}</td>
                <td>${escapeHtml(p.device_class)}</td>
                <td>${escapeHtml(p.manufacturer || '-')}</td>
                <td>x${p.link_width} ${escapeHtml(p.link_speed)}</td>
            </tr>`;
        });
        html += '</tbody></table></div>';
    }

    container.innerHTML = html;
}

async function loadFirmware() {
    try {
        const response = await fetch('/api/firmware');
        const firmware = await response.json();
        renderFirmware(firmware);
    } catch (err) {
        console.error('Failed to load firmware:', err);
    }
}

function renderFirmware(firmware) {
    const tbody = document.querySelector('#firmware-table tbody');
    tbody.innerHTML = '';

    if (!firmware || firmware.length === 0) {
        tbody.innerHTML = '<tr><td colspan="3" class="no-data">Нет данных о прошивках</td></tr>';
        return;
    }

    firmware.forEach(fw => {
        const row = document.createElement('tr');
        row.innerHTML = `
            <td>${escapeHtml(fw.device_name)}</td>
            <td><code>${escapeHtml(fw.version)}</code></td>
            <td>${escapeHtml(fw.build_date || '-')}</td>
        `;
        tbody.appendChild(row);
    });
}

async function loadSensors() {
    try {
        const response = await fetch('/api/sensors');
        allSensors = await response.json();
        renderSensors(allSensors);
    } catch (err) {
        console.error('Failed to load sensors:', err);
    }
}

function renderSensors(sensors) {
    const container = document.getElementById('sensors-content');

    if (!sensors || sensors.length === 0) {
        container.innerHTML = '<p class="no-data">Нет данных о сенсорах</p>';
        return;
    }

    // Group by type
    const byType = {};
    sensors.forEach(s => {
        if (!byType[s.type]) byType[s.type] = [];
        byType[s.type].push(s);
    });

    const typeNames = {
        temperature: 'Температура',
        voltage: 'Напряжение',
        power: 'Мощность',
        fan_speed: 'Вентиляторы',
        fan_status: 'Статус вентиляторов',
        psu_status: 'Статус БП',
        cpu_status: 'Статус CPU',
        storage_status: 'Статус накопителей',
        other: 'Прочее'
    };

    let html = '';
    for (const [type, items] of Object.entries(byType)) {
        html += `<div class="sensor-group" data-type="${type}">
            <h3>${typeNames[type] || type}</h3>
            <div class="sensor-grid">`;

        items.forEach(s => {
            let valueStr = '';
            let statusClass = s.status === 'ok' ? 'ok' : (s.status === 'ns' ? 'ns' : 'warn');

            if (s.value) {
                valueStr = `${s.value} ${s.unit}`;
            } else if (s.raw_value) {
                valueStr = s.raw_value;
            } else {
                valueStr = s.status;
            }

            html += `<div class="sensor-card ${statusClass}">
                <span class="sensor-name">${escapeHtml(s.name)}</span>
                <span class="sensor-value">${escapeHtml(valueStr)}</span>
            </div>`;
        });

        html += '</div></div>';
    }

    container.innerHTML = html;
}

function filterSensors(type) {
    if (!type) {
        renderSensors(allSensors);
        return;
    }
    const filtered = allSensors.filter(s => s.type === type);
    renderSensors(filtered);
}

async function loadSerials() {
    try {
        const response = await fetch('/api/serials');
        allSerials = await response.json();
        renderSerials(allSerials);
    } catch (err) {
        console.error('Failed to load serials:', err);
    }
}

function renderSerials(serials) {
    const tbody = document.querySelector('#serials-table tbody');
    tbody.innerHTML = '';

    if (!serials || serials.length === 0) {
        tbody.innerHTML = '<tr><td colspan="5" class="no-data">Нет серийных номеров</td></tr>';
        return;
    }

    const categoryNames = {
        'Board': 'Мат. плата',
        'CPU': 'Процессор',
        'Memory': 'Память',
        'Storage': 'Накопитель',
        'PCIe': 'PCIe',
        'Network': 'Сеть',
        'PSU': 'БП',
        'FRU': 'FRU'
    };

    serials.forEach(item => {
        if (!item.serial_number) return;
        const row = document.createElement('tr');
        row.innerHTML = `
            <td><span class="category-badge ${item.category.toLowerCase()}">${categoryNames[item.category] || item.category}</span></td>
            <td>${escapeHtml(item.component)}</td>
            <td><code>${escapeHtml(item.serial_number)}</code></td>
            <td>${escapeHtml(item.manufacturer || '-')}</td>
            <td>${escapeHtml(item.part_number || '-')}</td>
        `;
        tbody.appendChild(row);
    });
}

function filterSerials(category) {
    if (!category) {
        renderSerials(allSerials);
        return;
    }
    const filtered = allSerials.filter(s => s.category === category);
    renderSerials(filtered);
}

async function loadEvents() {
    try {
        const response = await fetch('/api/events');
        allEvents = await response.json();
        renderEvents(allEvents);
    } catch (err) {
        console.error('Failed to load events:', err);
    }
}

function renderEvents(events) {
    const tbody = document.querySelector('#events-table tbody');
    tbody.innerHTML = '';

    if (!events || events.length === 0) {
        tbody.innerHTML = '<tr><td colspan="4" class="no-data">Нет событий</td></tr>';
        return;
    }

    events.forEach(event => {
        const row = document.createElement('tr');
        row.innerHTML = `
            <td>${formatDate(event.timestamp)}</td>
            <td>${escapeHtml(event.source)}</td>
            <td>${escapeHtml(event.description)}</td>
            <td><span class="severity ${event.severity}">${event.severity}</span></td>
        `;
        tbody.appendChild(row);
    });
}

function filterEvents(severity) {
    if (!severity) {
        renderEvents(allEvents);
        return;
    }
    const filtered = allEvents.filter(e => e.severity === severity);
    renderEvents(filtered);
}

// Export functions
function exportData(format) {
    window.location.href = `/api/export/${format}`;
}

// Clear data
async function clearData() {
    try {
        await fetch('/api/clear', { method: 'DELETE' });
        document.getElementById('upload-section').classList.remove('hidden');
        document.getElementById('data-section').classList.add('hidden');
        document.getElementById('clear-btn').classList.add('hidden');
        document.getElementById('upload-status').textContent = '';
        allSensors = [];
        allEvents = [];
        allSerials = [];
    } catch (err) {
        console.error('Failed to clear data:', err);
    }
}

// Utilities
function formatDate(isoString) {
    if (!isoString) return '-';
    const date = new Date(isoString);
    return date.toLocaleString('ru-RU');
}

function escapeHtml(text) {
    if (!text) return '';
    const div = document.createElement('div');
    div.textContent = text;
    return div.innerHTML;
}
131
web/templates/index.html
Normal file
@@ -0,0 +1,131 @@
<!DOCTYPE html>
<html lang="ru">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>LOGPile - BMC Log Analyzer</title>
    <link rel="stylesheet" href="/static/css/style.css">
</head>
<body>
    <header>
        <h1>LOGPile</h1>
        <p>Анализатор диагностических данных BMC/IPMI</p>
    </header>

    <main>
        <section id="upload-section">
            <div class="upload-area" id="drop-zone">
                <p>Перетащите архив сюда или</p>
                <input type="file" id="file-input" accept="application/gzip,application/x-gzip,application/x-tar,application/zip" hidden>
                <button type="button" onclick="document.getElementById('file-input').click()">Выберите файл</button>
                <p class="hint">Поддерживаемые форматы: tar.gz, zip</p>
            </div>
            <div id="upload-status"></div>
        </section>

        <section id="data-section" class="hidden">
            <nav class="tabs">
                <button class="tab active" data-tab="config">Конфигурация</button>
                <button class="tab" data-tab="firmware">Прошивки</button>
                <button class="tab" data-tab="sensors">Сенсоры</button>
                <button class="tab" data-tab="serials">Серийные номера</button>
                <button class="tab" data-tab="events">События</button>
            </nav>

            <div class="tab-content active" id="config">
                <div class="toolbar">
                    <button onclick="exportData('json')">Экспорт JSON</button>
                    <button onclick="exportData('txt')">Экспорт TXT</button>
                </div>
                <div id="config-content"></div>
            </div>

            <div class="tab-content" id="firmware">
                <div class="toolbar">
                    <span class="toolbar-label">Версии прошивок компонентов</span>
                </div>
                <table id="firmware-table">
                    <thead>
                        <tr>
                            <th>Компонент</th>
                            <th>Версия</th>
                            <th>Дата сборки</th>
                        </tr>
                    </thead>
                    <tbody></tbody>
                </table>
            </div>

            <div class="tab-content" id="sensors">
                <div class="toolbar">
                    <select id="sensor-filter">
                        <option value="">Все сенсоры</option>
                        <option value="temperature">Температура</option>
                        <option value="voltage">Напряжение</option>
                        <option value="power">Мощность</option>
                        <option value="fan_speed">Вентиляторы</option>
                    </select>
                </div>
                <div id="sensors-content"></div>
            </div>

            <div class="tab-content" id="serials">
                <div class="toolbar">
                    <select id="serial-filter">
                        <option value="">Все компоненты</option>
                        <option value="Board">Материнская плата</option>
                        <option value="CPU">Процессоры</option>
                        <option value="Memory">Память</option>
                        <option value="Storage">Накопители</option>
                        <option value="PCIe">PCIe устройства</option>
                        <option value="Network">Сетевые адаптеры</option>
                        <option value="PSU">Блоки питания</option>
                        <option value="FRU">FRU</option>
                    </select>
                    <button onclick="exportData('csv')">Экспорт CSV</button>
                </div>
                <table id="serials-table">
                    <thead>
                        <tr>
                            <th>Категория</th>
                            <th>Компонент</th>
                            <th>Серийный номер</th>
                            <th>Производитель</th>
                            <th>Part Number</th>
                        </tr>
                    </thead>
                    <tbody></tbody>
                </table>
            </div>

            <div class="tab-content" id="events">
                <div class="toolbar">
                    <select id="severity-filter">
                        <option value="">Все события</option>
                        <option value="critical">Критические</option>
                        <option value="warning">Предупреждения</option>
                        <option value="info">Информационные</option>
                    </select>
                </div>
                <table id="events-table">
                    <thead>
                        <tr>
                            <th>Время</th>
                            <th>Источник</th>
                            <th>Описание</th>
                            <th>Важность</th>
                        </tr>
                    </thead>
                    <tbody></tbody>
                </table>
            </div>
        </section>
    </main>

    <footer>
        <button id="clear-btn" class="hidden" onclick="clearData()">Очистить данные</button>
    </footer>

    <script src="/static/js/app.js"></script>
</body>
</html>