Termix/src/backend/database/database.ts
ZacharyZcR 2e5cba73a4 feat: enhance server stats widgets and fix TypeScript/ESLint errors (#394)
* feat: add draggable server stats dashboard with customizable widgets

* fix: widget deletion and layout persistence issues

* fix: improve widget deletion UX and add debug logs for persistence

* fix: resolve widget deletion and layout persistence issues

- Add drag handles to widget title bars for precise drag control
- Prevent delete button from triggering drag via event stopPropagation
- Include statsConfig field in all GET/PUT API responses
- Remove debug console logs from production code

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* fix: complete statsConfig field support across all API routes

- Add statsConfig to POST /db/host (create) route
- Add statsConfig to all GET routes for consistent API responses
- Remove incorrect statsConfig schema from HostManagerEditor
- statsConfig is now only managed by Server page layout editor

* fix: add statsConfig to metrics API response

- Add statsConfig field to SSHHostWithCredentials interface
- Include statsConfig in resolveHostCredentials baseHost object
- Ensures /metrics/:id API returns complete host configuration

* fix: include statsConfig in SSH host create/update requests

The statsConfig field was being dropped by createSSHHost and updateSSHHost
functions in main-axios.ts, preventing layout customization from persisting.

Fixed by adding statsConfig to the submitData object in both functions.
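The round trip these commits repair can be sketched as follows. The field is JSON-encoded at rest and must be re-serialized on write and re-parsed on read at every layer, or it silently disappears; the type and helper names below are illustrative, not the actual Termix code:

```typescript
// Hypothetical shape: the stats configuration is a list of enabled widget ids.
type StatsConfig = string[];

interface HostRow {
  id: number;
  name: string;
  statsConfig: string | null; // stored as a JSON string in SQLite
}

// Serialize before the create/update request so the field is not dropped.
function toSubmitData(name: string, statsConfig: StatsConfig): Omit<HostRow, "id"> {
  return { name, statsConfig: JSON.stringify(statsConfig) };
}

// Deserialize in every GET/PUT response so clients always see the full config.
function toApiHost(row: HostRow): { id: number; name: string; statsConfig: StatsConfig } {
  return {
    id: row.id,
    name: row.name,
    statsConfig: row.statsConfig
      ? (JSON.parse(row.statsConfig) as StatsConfig)
      : [],
  };
}
```

Omitting `statsConfig` from either helper reproduces the original bug: the value survives in the database but never reaches the client, so layout customization appears not to persist.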

* feat: refactor server stats widgets into modular structure

Created dedicated widgets directory with individual components:
- CpuWidget, MemoryWidget, DiskWidget as separate components
- Widget registry for centralized widget configuration
- AddWidgetDialog for user-friendly widget selection
- Updated Server.tsx to use modular widget system

Benefits:
- Better code organization and maintainability
- Easier to add new widget types in the future
- Centralized widget metadata and configuration
- User can now add widgets via dialog interface

* fix: exit edit mode after saving layout

* feat: add customizable widget sizes with chart visualizations

Add three-tier size system (small/medium/large) for server stats widgets.
Integrate recharts library for visualizing trends in large widgets with
line charts (CPU), area charts (Memory), and radial bar charts (Disk).
Fix layout overflow issues with proper flexbox patterns.

* refactor: simplify server stats widget system

Replaced complex drag-and-drop grid layout with simple checkbox-based
configuration and static responsive grid display.

- Removed react-grid-layout dependency and 6 related packages
- Simplified StatsConfig from complex Widget objects to simple array
- Added Statistics tab in HostManagerEditor for checkbox selection
- Refactored Server.tsx to use CSS Grid instead of ResponsiveGridLayout
- Simplified widget components by removing edit mode and size selection
- Deleted unused AddWidgetDialog and registry files
- Fixed statsConfig serialization in backend routes

Net result: -787 lines of code, cleaner architecture.
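The simplified configuration described above can be sketched like this — a fixed widget list, a plain array as the persisted state, and a pure toggle function driven by the checkboxes (identifiers are assumptions, not the real component code):

```typescript
// Widget ids a user can toggle via checkboxes (illustrative list).
const AVAILABLE_WIDGETS = ["cpu", "memory", "disk"] as const;
type WidgetId = (typeof AVAILABLE_WIDGETS)[number];

// The simplified StatsConfig: just the enabled widget ids, in display order.
type StatsConfig = WidgetId[];

// Checking a box appends the id; unchecking removes it, preserving the
// order of the remaining widgets in the static grid.
function toggleWidget(config: StatsConfig, id: WidgetId): StatsConfig {
  return config.includes(id)
    ? config.filter((w) => w !== id)
    : [...config, id];
}
```

Compared to an array of positioned widget objects, this leaves nothing to drag, resize, or persist beyond membership and order, which is what makes the react-grid-layout dependency removable.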

* feat: add system, uptime, network and processes widgets

Add four new server statistics widgets:
- SystemWidget: displays hostname, OS, and kernel information
- UptimeWidget: shows server total uptime with formatted display
- NetworkWidget: lists network interfaces with IP and status
- ProcessesWidget: displays top processes by CPU usage

Backend changes:
- Extended SSH metrics collection to gather network, uptime, process, and system data
- Added commands to parse /proc/uptime, ip addr, ps aux output
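The uptime half of that collection is straightforward to sketch: the first whitespace-separated field of `/proc/uptime` is seconds since boot (the second is aggregate idle time). Function names here are illustrative, not the actual backend code:

```typescript
// /proc/uptime looks like "90061.57 350000.12": uptime seconds, idle seconds.
function parseProcUptime(contents: string): number {
  const seconds = parseFloat(contents.trim().split(/\s+/)[0]);
  if (Number.isNaN(seconds)) throw new Error("unparseable /proc/uptime");
  return seconds;
}

// Format seconds for widget display, e.g. 90061 -> "1d 1h 1m".
function formatUptime(totalSeconds: number): string {
  const days = Math.floor(totalSeconds / 86400);
  const hours = Math.floor((totalSeconds % 86400) / 3600);
  const minutes = Math.floor((totalSeconds % 3600) / 60);
  return `${days}d ${hours}h ${minutes}m`;
}
```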

Frontend changes:
- Created 4 new widget components with consistent styling
- Updated widget type definitions and HostManagerEditor
- Unified all widget heights to 280px for consistent layout
- Added translations for all new widgets (EN/ZH)

* refactor: improve widget styling and UX consistency

Enhance all server stats widgets with improved styling and user experience:

Widget improvements:
- Fix hardcoded titles, now use i18n translations for all widgets
- Improve data formatting with consistent translation keys
- Enhance empty state displays with better visual hierarchy
- Add smooth hover transitions and visual feedback
- Standardize spacing and layout patterns across widgets

Specific optimizations:
- CPU: Use translated load average display
- Memory: Translate "Free" label
- Disk: Translate "Available" label
- System: Improve icon colors and spacing consistency
- Network: Better empty state, enhanced card styling
- Processes: Improved card borders and spacing

Visual polish:
- Unified icon sizing and opacity for empty states
- Consistent border radius (rounded-lg)
- Better hover states with subtle transitions
- Enhanced font weights for improved readability

* fix: replace explicit any types with proper TypeScript types

- Replace 'any' with 'unknown' in catch blocks and add type assertions
- Create explicit interfaces for complex objects (HostConfig, TabData, TerminalHandle)
- Fix window/document object type extensions
- Update Electron API type definitions
- Improve type safety in database routes and utilities
- Add proper types to Terminal components (Desktop & Mobile)
- Fix navigation component types (TopNavbar, LeftSidebar, AppView)

Reduces TypeScript lint errors from 394 to 358 (-36 errors)
Fixes 45 @typescript-eslint/no-explicit-any violations
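The `any` → `unknown` change in catch blocks typically pairs with a small narrowing helper, since `unknown` cannot be used directly. A minimal sketch of the pattern (helper names are assumptions):

```typescript
// Extract a message from an unknown caught value without an `any` cast.
function errorMessage(err: unknown): string {
  if (err instanceof Error) return err.message;
  return typeof err === "string" ? err : JSON.stringify(err);
}

function safeCall(fn: () => void): string {
  try {
    fn();
    return "ok";
  } catch (err: unknown) {
    // `err` is narrowed by the helper; no explicit `any` needed.
    return errorMessage(err);
  }
}
```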

* fix: replace explicit any types with proper TypeScript types

- Create explicit interfaces for Request extensions (AuthenticatedRequest, RequestWithHeaders)
- Add type definitions for WebSocket messages and SSH connection data
- Use generic types in DataCrypto methods instead of any return types
- Define proper interfaces for file manager data structures
- Replace catch block any types with unknown and proper type assertions
- Add HostConfig and TabData interfaces for Server component

Fixes 32 @typescript-eslint/no-explicit-any violations across 5 files
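The DataCrypto change follows the standard generics-over-`any` pattern: a method that used to return `any` takes a type parameter so the caller declares the expected shape. A hypothetical sketch — the real DataCrypto API and its decryption logic are not shown here:

```typescript
// A decrypt routine that previously returned `any`; the generic parameter
// pushes the expected shape to the call site instead.
function decryptRecord<T>(ciphertext: string): T {
  // Stand-in for real decryption: this sketch just stores JSON.
  return JSON.parse(ciphertext) as T;
}

interface SSHHost {
  ip: string;
  port: number;
}

// `host.port` is typed `number`; no `any` leaks into call sites.
const host = decryptRecord<SSHHost>('{"ip":"10.0.0.5","port":22}');
```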

* fix: resolve 6 TypeScript compilation errors

Fixed field name mismatches and generic type issues:
- database.ts: Changed camelCase to snake_case for key_password, private_key, public_key fields
- simple-db-ops.ts: Added explicit generic type parameters to DataCrypto method calls

* fix: resolve unused variables in backend utils

Fixed @typescript-eslint/no-unused-vars errors in:
- starter.ts: removed unused error variables (2 fixes)
- auto-ssl-setup.ts: removed unused error variable (1 fix)
- ssh-key-utils.ts: removed unused error variables (3 fixes)
- user-crypto.ts: removed unused error variables (5 fixes)
- data-crypto.ts: removed unused plaintextFields and error variables (2 fixes)
- simple-db-ops.ts: removed unused parameters _userId and _tableName (2 fixes)

Total: 15 unused variable errors fixed

* fix: remove unused variable in terminal.ts

Fixed @typescript-eslint/no-unused-vars errors:
- Removed unused userPayload variable (line 123)
- Removed unused cols and rows from destructuring (line 348)

* fix: resolve unused variables in server-stats.ts

Fixed @typescript-eslint/no-unused-vars errors:
- Removed unused _reject parameter in Promise (line 64)
- Removed shadowed now variable in pollStatusesOnce (line 1130)

* fix: remove unused variables in tunnel.ts

Removed 5 unused variables:
- Removed unused data parameter from stdout event handler
- Removed hasSourcePassword, hasSourceKey, hasEndpointPassword, hasEndpointKey variables

* fix: remove unused variables in main-axios.ts

Removed 8 unused variables:
- Removed unused type imports (Credential, CredentialData, HostInfo, ApiResponse)
- Removed unused apiPort variable
- Removed unused error variables in 3 catch blocks

* fix: remove unused variables in terminal.ts and starter.ts

Removed 2 unused variables:
- Removed unused JWTPayload type import from terminal.ts
- Removed unused _promise parameter from starter.ts

* fix: remove unused variables in sidebar.tsx

Removed 9 unused variables:
- Removed 5 unused Sheet component imports
- Removed unused SIDEBAR_WIDTH_MOBILE constant
- Removed 3 unused variables from useSidebar destructuring

* fix: remove 13 unused variables in frontend files

- version-check-modal.tsx: removed 4 unused imports and functions
- main.tsx: removed unused isMobile state
- AdminSettings.tsx: removed 8 unused imports and error variables

* fix: remove 28 unused variables across frontend components

Cleaned up unused imports, state variables, and function parameters:
- CredentialsManager.tsx: removed 8 unused variables (Sheet/Select imports)
- FileManager.tsx: removed 10 unused variables (icons, state, functions)
- Terminal.tsx (Desktop): removed 5 unused variables (state, handlers)
- Terminal.tsx (Mobile): removed 5 unused variables (imports, state)

Reduced lint errors from 271 to 236 (35 errors fixed)

* fix: remove 10 unused variables in File Manager and config files

Cleaned up more unused imports, parameters, and variables:
- FileManagerGrid.tsx: removed 4 unused variables (params, function)
- FileManagerContextMenu.tsx: removed Share import
- FileManagerSidebar.tsx: removed onLoadDirectory parameter
- DraggableWindow.tsx: removed Square import
- FileWindow.tsx: removed updateWindow variable
- ServerConfig.tsx: removed 2 unused error parameters

Reduced lint errors from 236 to 222 (14 errors fixed total)

* fix: remove 7 unused variables in widgets and Homepage components

Cleaned up unused imports, parameters, and variables:
- DiskWidget.tsx: removed metricsHistory parameter
- FileManagerContextMenu.tsx: removed ExternalLink import
- Homepage.tsx: removed useTranslation import
- HomepageAlertManager.tsx: removed loading variable
- HomepageAuth.tsx: removed setCookie import (Desktop & Mobile)
- HompageUpdateLog.tsx: removed err parameter

Reduced lint errors from 222 to 216 (6 errors fixed)

* fix: remove 8 unused variables in File Manager and Host Manager components

Cleaned up unused imports, state variables, and function parameters:
- DiffViewer.tsx: removed unused error parameter in catch block
- FileViewer.tsx: removed ReactPlayer import, unused originalContent state,
  node parameters from markdown code components, audio variable
- HostManager.tsx: removed onSelectView and updatedHost parameters
- TunnelViewer.tsx: removed TunnelConnection import

Reduced lint errors from 271 to 208 (63 errors fixed total)

* fix: remove 7 unused variables in UI hooks and components

Cleaned up unused parameters and functions:
- status/index.tsx: removed unused className parameter from StatusIndicator
- useDragToDesktop.ts: removed unused sshHost parameter and dropped it from
  dependency arrays (4 occurrences)
- useDragToSystemDesktop.ts: removed unused sshHost parameter and
  getLastSaveDirectory function (29 lines removed)

Continued reducing frontend lint errors

* fix: remove 2 unused variables in hooks and TabContext

- useDragToDesktop.ts: removed unused onSuccess in dragFolderToDesktop
- TabContext.tsx: removed unused useTranslation import and t variable

Continued reducing frontend lint errors

* fix: remove 2 unused variables in Homepage component

- Removed unused isAdmin state variable (changed to setter only)
- Removed unused jwt variable by inlining getCookie check

Continued reducing frontend lint errors

* fix: remove 3 unused variables in Mobile navigation components

- Host.tsx: removed unused Server icon import
- LeftSidebar.tsx: removed unused setHostsLoading setter and err parameter

Continued reducing frontend lint errors

* fix: remove 9 unused variables across multiple files

Fixed unused variables in:
- database-file-encryption.ts: removed currentFingerprint (backend)
- FileManagerContextMenu.tsx: removed ExternalLink import, hasDirectories
- frontend-logger.ts: removed 5 unused shortUrl variables

Continued reducing lint errors

* fix: remove 18 unused variables across 4 files

- HostManagerViewer.tsx: remove 9 unused error variables and parameters
- HostManagerEditor.tsx: remove WidgetType import, hosts/loading states, error variable
- CredentialViewer.tsx: remove 3 unused error variables
- Server.tsx: remove 2 unused error variables

* fix: remove 9 unused variables across 4 files

- SnippetsSidebar.tsx: remove 3 unused err variables in catch blocks
- TunnelViewer.tsx: remove 2 unused parameters from callback
- DesktopApp.tsx: remove getCookie import and unused state variables
- HomepageAlertManager.tsx: remove 2 unused err variables in catch blocks

* fix: remove 10 unused variables and imports across 4 navigation files

- Homepage.tsx: remove unused username state variable
- AppView.tsx: remove 3 unused Lucide icon imports
- TopNavbar.tsx: remove 4 unused Accordion component imports
- LeftSidebar.tsx: remove 2 unused variables (err, jwt)

* fix: remove 5 unused variables across 4 user/credentials files

- PasswordReset.tsx: remove unused result variable
- UserProfile.tsx: remove unused Key import and err variable
- version-check-modal.tsx: remove unused setVersionDismissed setter
- CredentialsManager.tsx: remove unused e parameter from handleDragLeave

* fix: remove 2 unused variables in FileViewer and TerminalWindow

- FileViewer.tsx: remove unused node parameter from code component
- TerminalWindow.tsx: remove unused handleMinimize function

* fix: remove 10 unused variables in HomepageAuth.tsx

Removed unused variables:
- getCookie import
- dbError prop
- visibility state and toggleVisibility
- error state variable
- result variable in handleInitiatePasswordReset
- token URL parameter
- err parameters in catch blocks
- retryDatabaseConnection function
- Multiple setError(null) calls

* fix: remove 9 unused variables across multiple files

Files fixed:
- DesktopApp.tsx: Removed _nextView parameter
- TerminalWindow.tsx: Removed minimizeWindow
- Mobile Host.tsx: Removed Server import
- Mobile LeftSidebar.tsx: Removed setHostsLoading, err in catch
- Desktop LeftSidebar.tsx: Removed getCookie, setCookie, onSelectView, getView, setHostsLoading

* fix: remove 10 unused variables in Mobile files

Files fixed:
- MobileApp.tsx: Removed getCookie, removeTab, isAdmin, id, err parameters
- Mobile/HomepageAuth.tsx: Removed getCookie, error state, result, token, err parameters

All @typescript-eslint/no-unused-vars errors in frontend now resolved!

* fix: remove unused t variable in TabContext

Removed useTranslation import and unused t variable
in Mobile TabContext.tsx

All @typescript-eslint/no-unused-vars errors now resolved!
Total fixed: 154 unused variables

* fix: resolve TypeScript and ESLint errors across the codebase

- Fixed @typescript-eslint/no-unused-vars errors (31 instances)
- Fixed @typescript-eslint/no-explicit-any errors in backend (~22 instances)
- Fixed @typescript-eslint/no-explicit-any errors in frontend (~60 instances)
- Fixed prefer-const errors (5 instances)
- Fixed no-empty-object-type and rules-of-hooks errors
- Added proper type assertions for database operations
- Improved type safety in authentication and encryption modules
- Enhanced type definitions for API routes and SSH operations

All TypeScript compilation errors resolved. Application builds and runs successfully.

* fix: disable react-refresh/only-export-components rule for component files

Disable the react-refresh/only-export-components ESLint rule in files
that export both components and related utilities (hooks, types,
constants). This is a pragmatic solution to maintain code organization
without splitting files unnecessarily.
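The rule is disabled per file with an inline directive rather than in the shared ESLint config. A sketch of what such a file header looks like (the exported names are illustrative):

```typescript
/* eslint-disable react-refresh/only-export-components */
// This file intentionally exports a component alongside related
// non-component values, which the rule would otherwise flag.
const SIDEBAR_DEFAULT_OPEN = true; // illustrative non-component export

function useSidebarDefault(): boolean {
  return SIDEBAR_DEFAULT_OPEN;
}
```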

* style: fix prettier formatting issues

Fix code style issues in translation file and TOTP dialog component
to pass CI prettier check.

* chore: fix rollup optional dependencies installation in CI

Add step to force reinstall rollup after npm ci to fix the known npm
bug with optional dependencies on Linux x64 platform.

* chore: fix lightningcss optional dependencies in CI

Add lightningcss to the force reinstall step to fix npm optional
dependencies bug for both rollup and lightningcss on Linux x64.

* chore: fix npm optional dependencies bug in CI

Remove package-lock.json and node_modules before install to properly
handle optional dependencies for rollup, lightningcss, and tailwindcss
native bindings on Linux x64 platform as recommended by npm.

* Update src/types/index.ts

Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>

* Set terminal environment variables for SSH

Added environment variables for terminal configuration.

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Karmaa <88517757+LukeGus@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-14 20:46:36 -05:00

1572 lines
46 KiB
TypeScript

import express from "express";
import bodyParser from "body-parser";
import multer from "multer";
import cookieParser from "cookie-parser";
import userRoutes from "./routes/users.js";
import sshRoutes from "./routes/ssh.js";
import alertRoutes from "./routes/alerts.js";
import credentialsRoutes from "./routes/credentials.js";
import snippetsRoutes from "./routes/snippets.js";
import cors from "cors";
import fetch from "node-fetch";
import fs from "fs";
import path from "path";
import os from "os";
import "dotenv/config";
import { databaseLogger, apiLogger } from "../utils/logger.js";
import { AuthManager } from "../utils/auth-manager.js";
import { DataCrypto } from "../utils/data-crypto.js";
import { DatabaseFileEncryption } from "../utils/database-file-encryption.js";
import { DatabaseMigration } from "../utils/database-migration.js";
import { UserDataExport } from "../utils/user-data-export.js";
import { AutoSSLSetup } from "../utils/auto-ssl-setup.js";
import { eq, and } from "drizzle-orm";
import {
  users,
  sshData,
  sshCredentials,
  fileManagerRecent,
  fileManagerPinned,
  fileManagerShortcuts,
  dismissedAlerts,
  sshCredentialUsage,
  settings,
} from "./db/schema.js";
import type {
  CacheEntry,
  GitHubRelease,
  GitHubAPIResponse,
  AuthenticatedRequest,
} from "../../types/index.js";
import { getDb } from "./db/index.js";
import Database from "better-sqlite3";
const app = express();
app.set("trust proxy", true);
const authManager = AuthManager.getInstance();
const authenticateJWT = authManager.createAuthMiddleware();
const requireAdmin = authManager.createAdminMiddleware();
app.use(
  cors({
    origin: (origin, callback) => {
      if (!origin) return callback(null, true);
      const allowedOrigins = [
        "http://localhost:5173",
        "http://localhost:3000",
        "http://127.0.0.1:5173",
        "http://127.0.0.1:3000",
      ];
      if (origin.startsWith("https://")) {
        return callback(null, true);
      }
      if (origin.startsWith("http://")) {
        return callback(null, true);
      }
      if (allowedOrigins.includes(origin)) {
        return callback(null, true);
      }
      callback(new Error("Not allowed by CORS"));
    },
    credentials: true,
    methods: ["GET", "POST", "PUT", "PATCH", "DELETE", "OPTIONS"],
    allowedHeaders: [
      "Content-Type",
      "Authorization",
      "User-Agent",
      "X-Electron-App",
    ],
  }),
);
const storage = multer.diskStorage({
  destination: (req, file, cb) => {
    cb(null, "uploads/");
  },
  filename: (req, file, cb) => {
    const timestamp = Date.now();
    cb(null, `${timestamp}-${file.originalname}`);
  },
});
const upload = multer({
  storage: storage,
  limits: {
    fileSize: 1024 * 1024 * 1024,
  },
  fileFilter: (req, file, cb) => {
    if (
      file.originalname.endsWith(".termix-export.sqlite") ||
      file.originalname.endsWith(".sqlite")
    ) {
      cb(null, true);
    } else {
      cb(new Error("Only .termix-export.sqlite files are allowed"));
    }
  },
});
class GitHubCache {
  private cache: Map<string, CacheEntry> = new Map();
  private readonly CACHE_DURATION = 30 * 60 * 1000;
  set<T>(key: string, data: T): void {
    const now = Date.now();
    this.cache.set(key, {
      data,
      timestamp: now,
      expiresAt: now + this.CACHE_DURATION,
    });
  }
  get<T>(key: string): T | null {
    const entry = this.cache.get(key);
    if (!entry) {
      return null;
    }
    if (Date.now() > entry.expiresAt) {
      this.cache.delete(key);
      return null;
    }
    return entry.data as T;
  }
}
const githubCache = new GitHubCache();
const GITHUB_API_BASE = "https://api.github.com";
const REPO_OWNER = "Termix-SSH";
const REPO_NAME = "Termix";
async function fetchGitHubAPI<T>(
  endpoint: string,
  cacheKey: string,
): Promise<GitHubAPIResponse<T>> {
  const cachedEntry = githubCache.get<CacheEntry<T>>(cacheKey);
  if (cachedEntry) {
    return {
      data: cachedEntry.data,
      cached: true,
      cache_age: Date.now() - cachedEntry.timestamp,
    };
  }
  try {
    const response = await fetch(`${GITHUB_API_BASE}${endpoint}`, {
      headers: {
        Accept: "application/vnd.github+json",
        "User-Agent": "TermixUpdateChecker/1.0",
        "X-GitHub-Api-Version": "2022-11-28",
      },
    });
    if (!response.ok) {
      throw new Error(
        `GitHub API error: ${response.status} ${response.statusText}`,
      );
    }
    const data = (await response.json()) as T;
    const cacheData: CacheEntry<T> = {
      data,
      timestamp: Date.now(),
      expiresAt: Date.now() + 30 * 60 * 1000,
    };
    githubCache.set(cacheKey, cacheData);
    return {
      data: data,
      cached: false,
    };
  } catch (error) {
    databaseLogger.error(`Failed to fetch from GitHub API`, error, {
      operation: "github_api",
      endpoint,
    });
    throw error;
  }
}
app.use(bodyParser.json({ limit: "1gb" }));
app.use(bodyParser.urlencoded({ limit: "1gb", extended: true }));
app.use(bodyParser.raw({ limit: "5gb", type: "application/octet-stream" }));
app.use(cookieParser());
app.get("/health", (req, res) => {
  res.json({ status: "ok" });
});
app.get("/version", authenticateJWT, async (req, res) => {
  let localVersion = process.env.VERSION;
  if (!localVersion) {
    const versionSources = [
      () => {
        try {
          const packagePath = path.resolve(process.cwd(), "package.json");
          const packageJson = JSON.parse(fs.readFileSync(packagePath, "utf8"));
          return packageJson.version;
        } catch {
          return null;
        }
      },
      () => {
        try {
          const packagePath = path.resolve("/app", "package.json");
          const packageJson = JSON.parse(fs.readFileSync(packagePath, "utf8"));
          return packageJson.version;
        } catch {
          return null;
        }
      },
      () => {
        try {
          const packagePath = path.resolve(__dirname, "../../../package.json");
          const packageJson = JSON.parse(fs.readFileSync(packagePath, "utf8"));
          return packageJson.version;
        } catch {
          return null;
        }
      },
    ];
    for (const getVersion of versionSources) {
      try {
        const foundVersion = getVersion();
        if (foundVersion && foundVersion !== "unknown") {
          localVersion = foundVersion;
          break;
        }
      } catch {
        continue;
      }
    }
  }
  if (!localVersion) {
    databaseLogger.error("No version information available", undefined, {
      operation: "version_check",
    });
    return res.status(404).send("Local Version Not Set");
  }
  try {
    const cacheKey = "latest_release";
    const releaseData = await fetchGitHubAPI<GitHubRelease>(
      `/repos/${REPO_OWNER}/${REPO_NAME}/releases/latest`,
      cacheKey,
    );
    const rawTag = releaseData.data.tag_name || releaseData.data.name || "";
    const remoteVersionMatch = rawTag.match(/(\d+\.\d+(\.\d+)?)/);
    const remoteVersion = remoteVersionMatch ? remoteVersionMatch[1] : null;
    if (!remoteVersion) {
      databaseLogger.warn("Remote version not found in GitHub response", {
        operation: "version_check",
        rawTag,
      });
      return res.status(401).send("Remote Version Not Found");
    }
    const isUpToDate = localVersion === remoteVersion;
    const response = {
      status: isUpToDate ? "up_to_date" : "requires_update",
      localVersion: localVersion,
      version: remoteVersion,
      latest_release: {
        tag_name: releaseData.data.tag_name,
        name: releaseData.data.name,
        published_at: releaseData.data.published_at,
        html_url: releaseData.data.html_url,
      },
      cached: releaseData.cached,
      cache_age: releaseData.cache_age,
    };
    res.json(response);
  } catch (err) {
    databaseLogger.error("Version check failed", err, {
      operation: "version_check",
    });
    res.status(500).send("Fetch Error");
  }
});
app.get("/releases/rss", authenticateJWT, async (req, res) => {
  try {
    const page = parseInt(req.query.page as string) || 1;
    const per_page = Math.min(
      parseInt(req.query.per_page as string) || 20,
      100,
    );
    const cacheKey = `releases_rss_${page}_${per_page}`;
    const releasesData = await fetchGitHubAPI<GitHubRelease[]>(
      `/repos/${REPO_OWNER}/${REPO_NAME}/releases?page=${page}&per_page=${per_page}`,
      cacheKey,
    );
    const rssItems = releasesData.data.map((release) => ({
      id: release.id,
      title: release.name || release.tag_name,
      description: release.body,
      link: release.html_url,
      pubDate: release.published_at,
      version: release.tag_name,
      isPrerelease: release.prerelease,
      isDraft: release.draft,
      assets: release.assets.map((asset) => ({
        name: asset.name,
        size: asset.size,
        download_count: asset.download_count,
        download_url: asset.browser_download_url,
      })),
    }));
    const response = {
      feed: {
        title: `${REPO_NAME} Releases`,
        description: `Latest releases from ${REPO_NAME} repository`,
        link: `https://github.com/${REPO_OWNER}/${REPO_NAME}/releases`,
        updated: new Date().toISOString(),
      },
      items: rssItems,
      total_count: rssItems.length,
      cached: releasesData.cached,
      cache_age: releasesData.cache_age,
    };
    res.json(response);
  } catch (error) {
    databaseLogger.error("Failed to generate RSS format", error, {
      operation: "rss_releases",
    });
    res.status(500).json({
      error: "Failed to generate RSS format",
      details: error instanceof Error ? error.message : "Unknown error",
    });
  }
});
app.get("/encryption/status", requireAdmin, async (req, res) => {
  try {
    const securityStatus = {
      initialized: true,
      system: { hasSecret: true, isValid: true },
      activeSessions: {},
      activeSessionCount: 0,
    };
    res.json({
      security: securityStatus,
      version: "v2-kek-dek",
    });
  } catch (error) {
    apiLogger.error("Failed to get security status", error, {
      operation: "security_status",
    });
    res.status(500).json({ error: "Failed to get security status" });
  }
});
app.post("/encryption/initialize", requireAdmin, async (req, res) => {
  try {
    const authManager = AuthManager.getInstance();
    const isValid = true;
    if (!isValid) {
      await authManager.initialize();
    }
    res.json({
      success: true,
      message: "Security system initialized successfully",
      version: "v2-kek-dek",
      note: "User data encryption will be set up when users log in",
    });
  } catch (error) {
    apiLogger.error("Failed to initialize security system", error, {
      operation: "security_init_api_failed",
    });
    res.status(500).json({ error: "Failed to initialize security system" });
  }
});
app.post("/encryption/regenerate", requireAdmin, async (req, res) => {
  try {
    apiLogger.warn("System JWT secret regenerated via API", {
      operation: "jwt_regenerate_api",
    });
    res.json({
      success: true,
      message: "System JWT secret regenerated",
      warning:
        "All existing JWT tokens are now invalid - users must re-authenticate",
      note: "User data encryption keys are protected by passwords and cannot be regenerated",
    });
  } catch (error) {
    apiLogger.error("Failed to regenerate JWT secret", error, {
      operation: "jwt_regenerate_failed",
    });
    res.status(500).json({ error: "Failed to regenerate JWT secret" });
  }
});
app.post("/encryption/regenerate-jwt", requireAdmin, async (req, res) => {
  try {
    apiLogger.warn("JWT secret regenerated via API", {
      operation: "jwt_secret_regenerate_api",
    });
    res.json({
      success: true,
      message: "New JWT secret generated",
      warning:
        "All existing JWT tokens are now invalid - users must re-authenticate",
    });
  } catch (error) {
    apiLogger.error("Failed to regenerate JWT secret", error, {
      operation: "jwt_secret_regenerate_failed",
    });
    res.status(500).json({ error: "Failed to regenerate JWT secret" });
  }
});
app.post("/database/export", authenticateJWT, async (req, res) => {
  try {
    const userId = (req as AuthenticatedRequest).userId;
    const { password } = req.body;
    if (!password) {
      return res.status(400).json({
        error: "Password required for export",
        code: "PASSWORD_REQUIRED",
      });
    }
    const unlocked = await authManager.authenticateUser(userId, password);
    if (!unlocked) {
      return res.status(401).json({ error: "Invalid password" });
    }
    apiLogger.info("Exporting user data as SQLite", {
      operation: "user_data_sqlite_export_api",
      userId,
    });
    const userDataKey = DataCrypto.getUserDataKey(userId);
    if (!userDataKey) {
      throw new Error("User data not unlocked");
    }
    const user = await getDb().select().from(users).where(eq(users.id, userId));
    if (!user || user.length === 0) {
      throw new Error(`User not found: ${userId}`);
    }
    const tempDir =
      process.env.NODE_ENV === "production"
        ? path.join(process.env.DATA_DIR || "./db/data", ".temp", "exports")
        : path.join(os.tmpdir(), "termix-exports");
    try {
      if (!fs.existsSync(tempDir)) {
        fs.mkdirSync(tempDir, { recursive: true });
      }
    } catch (dirError) {
      apiLogger.error("Failed to create temp directory", dirError, {
        operation: "export_temp_dir_error",
        tempDir,
      });
      throw new Error(`Failed to create temp directory: ${dirError.message}`);
    }
    const timestamp = new Date().toISOString().replace(/[:.]/g, "-");
    const filename = `termix-export-${user[0].username}-${timestamp}.sqlite`;
    const tempPath = path.join(tempDir, filename);
    apiLogger.info("Creating export database", {
      operation: "export_db_creation",
      userId,
      tempPath,
    });
    const exportDb = new Database(tempPath);
    try {
      exportDb.exec(`
        CREATE TABLE users (
          id TEXT PRIMARY KEY,
          username TEXT NOT NULL,
          password_hash TEXT NOT NULL,
          is_admin INTEGER NOT NULL DEFAULT 0,
          is_oidc INTEGER NOT NULL DEFAULT 0,
          oidc_identifier TEXT,
          client_id TEXT,
          client_secret TEXT,
          issuer_url TEXT,
          authorization_url TEXT,
          token_url TEXT,
          identifier_path TEXT,
          name_path TEXT,
          scopes TEXT DEFAULT 'openid email profile',
          totp_secret TEXT,
          totp_enabled INTEGER NOT NULL DEFAULT 0,
          totp_backup_codes TEXT
        );
        CREATE TABLE settings (
          key TEXT PRIMARY KEY,
          value TEXT NOT NULL
        );
        CREATE TABLE ssh_data (
          id INTEGER PRIMARY KEY AUTOINCREMENT,
          user_id TEXT NOT NULL,
          name TEXT,
          ip TEXT NOT NULL,
          port INTEGER NOT NULL,
          username TEXT NOT NULL,
          folder TEXT,
          tags TEXT,
          pin INTEGER NOT NULL DEFAULT 0,
          auth_type TEXT NOT NULL,
          password TEXT,
          key TEXT,
          key_password TEXT,
          key_type TEXT,
          autostart_password TEXT,
          autostart_key TEXT,
          autostart_key_password TEXT,
          credential_id INTEGER,
          enable_terminal INTEGER NOT NULL DEFAULT 1,
          enable_tunnel INTEGER NOT NULL DEFAULT 1,
          tunnel_connections TEXT,
          enable_file_manager INTEGER NOT NULL DEFAULT 1,
          default_path TEXT,
          created_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP,
          updated_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP
        );
        CREATE TABLE ssh_credentials (
          id INTEGER PRIMARY KEY AUTOINCREMENT,
          user_id TEXT NOT NULL,
          name TEXT NOT NULL,
          description TEXT,
          folder TEXT,
          tags TEXT,
          auth_type TEXT NOT NULL,
          username TEXT NOT NULL,
          password TEXT,
          key TEXT,
          private_key TEXT,
          public_key TEXT,
          key_password TEXT,
          key_type TEXT,
          detected_key_type TEXT,
          usage_count INTEGER NOT NULL DEFAULT 0,
          last_used TEXT,
          created_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP,
          updated_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP
        );
        CREATE TABLE file_manager_recent (
          id INTEGER PRIMARY KEY AUTOINCREMENT,
          user_id TEXT NOT NULL,
          host_id INTEGER NOT NULL,
          name TEXT NOT NULL,
          path TEXT NOT NULL,
          last_opened TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP
        );
        CREATE TABLE file_manager_pinned (
          id INTEGER PRIMARY KEY AUTOINCREMENT,
          user_id TEXT NOT NULL,
          host_id INTEGER NOT NULL,
          name TEXT NOT NULL,
          path TEXT NOT NULL,
          pinned_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP
        );
        CREATE TABLE file_manager_shortcuts (
          id INTEGER PRIMARY KEY AUTOINCREMENT,
          user_id TEXT NOT NULL,
          host_id INTEGER NOT NULL,
          name TEXT NOT NULL,
          path TEXT NOT NULL,
          created_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP
        );
        CREATE TABLE dismissed_alerts (
          id INTEGER PRIMARY KEY AUTOINCREMENT,
          user_id TEXT NOT NULL,
          alert_id TEXT NOT NULL,
          dismissed_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP
        );
        CREATE TABLE ssh_credential_usage (
          id INTEGER PRIMARY KEY AUTOINCREMENT,
          credential_id INTEGER NOT NULL,
          host_id INTEGER NOT NULL,
          user_id TEXT NOT NULL,
          used_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP
        );
      `);
const userRecord = user[0];
const insertUser = exportDb.prepare(`
INSERT INTO users (id, username, password_hash, is_admin, is_oidc, oidc_identifier, client_id, client_secret, issuer_url, authorization_url, token_url, identifier_path, name_path, scopes, totp_secret, totp_enabled, totp_backup_codes)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`);
insertUser.run(
userRecord.id,
userRecord.username,
"[EXPORTED_USER_NO_PASSWORD]",
userRecord.is_admin ? 1 : 0,
userRecord.is_oidc ? 1 : 0,
userRecord.oidc_identifier || null,
userRecord.client_id || null,
userRecord.client_secret || null,
userRecord.issuer_url || null,
userRecord.authorization_url || null,
userRecord.token_url || null,
userRecord.identifier_path || null,
userRecord.name_path || null,
userRecord.scopes || null,
userRecord.totp_secret || null,
userRecord.totp_enabled ? 1 : 0,
userRecord.totp_backup_codes || null,
);
const sshHosts = await getDb()
.select()
.from(sshData)
.where(eq(sshData.userId, userId));
const insertHost = exportDb.prepare(`
INSERT INTO ssh_data (id, user_id, name, ip, port, username, folder, tags, pin, auth_type, password, key, key_password, key_type, autostart_password, autostart_key, autostart_key_password, credential_id, enable_terminal, enable_tunnel, tunnel_connections, enable_file_manager, default_path, created_at, updated_at)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`);
for (const host of sshHosts) {
const decrypted = DataCrypto.decryptRecord(
"ssh_data",
host,
userId,
userDataKey,
);
insertHost.run(
decrypted.id,
decrypted.userId,
decrypted.name || null,
decrypted.ip,
decrypted.port,
decrypted.username,
decrypted.folder || null,
decrypted.tags || null,
decrypted.pin ? 1 : 0,
decrypted.authType,
decrypted.password || null,
decrypted.key || null,
decrypted.keyPassword || null,
decrypted.keyType || null,
decrypted.autostartPassword || null,
decrypted.autostartKey || null,
decrypted.autostartKeyPassword || null,
decrypted.credentialId || null,
decrypted.enableTerminal ? 1 : 0,
decrypted.enableTunnel ? 1 : 0,
decrypted.tunnelConnections || null,
decrypted.enableFileManager ? 1 : 0,
decrypted.defaultPath || null,
decrypted.createdAt,
decrypted.updatedAt,
);
}
const credentials = await getDb()
.select()
.from(sshCredentials)
.where(eq(sshCredentials.userId, userId));
const insertCred = exportDb.prepare(`
INSERT INTO ssh_credentials (id, user_id, name, description, folder, tags, auth_type, username, password, key, private_key, public_key, key_password, key_type, detected_key_type, usage_count, last_used, created_at, updated_at)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`);
for (const cred of credentials) {
const decrypted = DataCrypto.decryptRecord(
"ssh_credentials",
cred,
userId,
userDataKey,
);
insertCred.run(
decrypted.id,
decrypted.userId,
decrypted.name,
decrypted.description || null,
decrypted.folder || null,
decrypted.tags || null,
decrypted.authType,
decrypted.username,
decrypted.password || null,
decrypted.key || null,
decrypted.privateKey || null,
decrypted.publicKey || null,
decrypted.keyPassword || null,
decrypted.keyType || null,
decrypted.detectedKeyType || null,
decrypted.usageCount || 0,
decrypted.lastUsed || null,
decrypted.createdAt,
decrypted.updatedAt,
);
}
const [recentFiles, pinnedFiles, shortcuts] = await Promise.all([
getDb()
.select()
.from(fileManagerRecent)
.where(eq(fileManagerRecent.userId, userId)),
getDb()
.select()
.from(fileManagerPinned)
.where(eq(fileManagerPinned.userId, userId)),
getDb()
.select()
.from(fileManagerShortcuts)
.where(eq(fileManagerShortcuts.userId, userId)),
]);
const insertRecent = exportDb.prepare(`
INSERT INTO file_manager_recent (id, user_id, host_id, name, path, last_opened)
VALUES (?, ?, ?, ?, ?, ?)
`);
for (const item of recentFiles) {
insertRecent.run(
item.id,
item.userId,
item.hostId,
item.name,
item.path,
item.lastOpened,
);
}
const insertPinned = exportDb.prepare(`
INSERT INTO file_manager_pinned (id, user_id, host_id, name, path, pinned_at)
VALUES (?, ?, ?, ?, ?, ?)
`);
for (const item of pinnedFiles) {
insertPinned.run(
item.id,
item.userId,
item.hostId,
item.name,
item.path,
item.pinnedAt,
);
}
const insertShortcut = exportDb.prepare(`
INSERT INTO file_manager_shortcuts (id, user_id, host_id, name, path, created_at)
VALUES (?, ?, ?, ?, ?, ?)
`);
for (const item of shortcuts) {
insertShortcut.run(
item.id,
item.userId,
item.hostId,
item.name,
item.path,
item.createdAt,
);
}
const alerts = await getDb()
.select()
.from(dismissedAlerts)
.where(eq(dismissedAlerts.userId, userId));
const insertAlert = exportDb.prepare(`
INSERT INTO dismissed_alerts (id, user_id, alert_id, dismissed_at)
VALUES (?, ?, ?, ?)
`);
for (const alert of alerts) {
insertAlert.run(
alert.id,
alert.userId,
alert.alertId,
alert.dismissedAt,
);
}
const usage = await getDb()
.select()
.from(sshCredentialUsage)
.where(eq(sshCredentialUsage.userId, userId));
const insertUsage = exportDb.prepare(`
INSERT INTO ssh_credential_usage (id, credential_id, host_id, user_id, used_at)
VALUES (?, ?, ?, ?, ?)
`);
for (const item of usage) {
insertUsage.run(
item.id,
item.credentialId,
item.hostId,
item.userId,
item.usedAt,
);
}
const settingsData = await getDb().select().from(settings);
const insertSetting = exportDb.prepare(`
INSERT INTO settings (key, value)
VALUES (?, ?)
`);
for (const setting of settingsData) {
insertSetting.run(setting.key, setting.value);
}
} finally {
exportDb.close();
}
res.setHeader("Content-Type", "application/x-sqlite3");
res.setHeader("Content-Disposition", `attachment; filename="${filename}"`);
const fileStream = fs.createReadStream(tempPath);
fileStream.on("error", (streamError) => {
apiLogger.error("File stream error during export", streamError, {
operation: "export_file_stream_error",
userId,
tempPath,
});
if (!res.headersSent) {
res.status(500).json({
error: "Failed to stream export file",
details: streamError.message,
});
}
});
fileStream.on("end", () => {
apiLogger.success("User data exported as SQLite successfully", {
operation: "user_data_sqlite_export_success",
userId,
filename,
});
fs.unlink(tempPath, (err) => {
if (err) {
apiLogger.warn("Failed to clean up export file", {
operation: "export_cleanup_failed",
path: tempPath,
error: err.message,
});
}
});
});
fileStream.pipe(res);
} catch (error) {
apiLogger.error("User data SQLite export failed", error, {
operation: "user_data_sqlite_export_failed",
});
res.status(500).json({
error: "Failed to export user data",
details: error instanceof Error ? error.message : "Unknown error",
});
}
});
app.post(
"/database/import",
authenticateJWT,
upload.single("file"),
async (req, res) => {
try {
if (!req.file) {
return res.status(400).json({ error: "No file uploaded" });
}
const userId = (req as AuthenticatedRequest).userId;
const { password } = req.body;
if (!password) {
return res.status(400).json({
error: "Password required for import",
code: "PASSWORD_REQUIRED",
});
}
const unlocked = await authManager.authenticateUser(userId, password);
if (!unlocked) {
return res.status(401).json({ error: "Invalid password" });
}
apiLogger.info("Importing SQLite data", {
operation: "sqlite_import_api",
userId,
filename: req.file.originalname,
fileSize: req.file.size,
mimetype: req.file.mimetype,
});
const userDataKey = DataCrypto.getUserDataKey(userId);
if (!userDataKey) {
throw new Error("User data not unlocked");
}
if (!fs.existsSync(req.file.path)) {
return res.status(400).json({
error: "Uploaded file not found",
details: "File was not properly uploaded",
});
}
const fileHeader = Buffer.alloc(16);
const fd = fs.openSync(req.file.path, "r");
fs.readSync(fd, fileHeader, 0, 16, 0);
fs.closeSync(fd);
const sqliteHeader = "SQLite format 3";
if (fileHeader.toString("utf8", 0, 15) !== sqliteHeader) {
return res.status(400).json({
error: "Invalid file format - not a SQLite database",
details: `Expected SQLite file, got file starting with: ${fileHeader.toString("utf8", 0, 15)}`,
});
}
let importDb;
try {
importDb = new Database(req.file.path, { readonly: true });
importDb
.prepare("SELECT name FROM sqlite_master WHERE type='table'")
.all();
} catch (sqliteError) {
if (importDb) {
importDb.close();
}
return res.status(400).json({
error: "Failed to open SQLite database",
details:
sqliteError instanceof Error ? sqliteError.message : "Unknown error",
});
}
const result = {
success: false,
summary: {
sshHostsImported: 0,
sshCredentialsImported: 0,
fileManagerItemsImported: 0,
dismissedAlertsImported: 0,
credentialUsageImported: 0,
settingsImported: 0,
skippedItems: 0,
errors: [] as string[],
},
};
try {
const mainDb = getDb();
try {
const importedHosts = importDb
.prepare("SELECT * FROM ssh_data")
.all();
for (const host of importedHosts) {
try {
const existing = await mainDb
.select()
.from(sshData)
.where(
and(
eq(sshData.userId, userId),
eq(sshData.ip, host.ip),
eq(sshData.port, host.port),
eq(sshData.username, host.username),
),
);
if (existing.length > 0) {
result.summary.skippedItems++;
continue;
}
const hostData = {
userId: userId,
name: host.name,
ip: host.ip,
port: host.port,
username: host.username,
folder: host.folder,
tags: host.tags,
pin: Boolean(host.pin),
authType: host.auth_type,
password: host.password,
key: host.key,
keyPassword: host.key_password,
keyType: host.key_type,
autostartPassword: host.autostart_password,
autostartKey: host.autostart_key,
autostartKeyPassword: host.autostart_key_password,
credentialId: null,
enableTerminal: Boolean(host.enable_terminal),
enableTunnel: Boolean(host.enable_tunnel),
tunnelConnections: host.tunnel_connections,
enableFileManager: Boolean(host.enable_file_manager),
defaultPath: host.default_path,
createdAt: host.created_at || new Date().toISOString(),
updatedAt: new Date().toISOString(),
};
const encrypted = DataCrypto.encryptRecord(
"ssh_data",
hostData,
userId,
userDataKey,
);
await mainDb.insert(sshData).values(encrypted);
result.summary.sshHostsImported++;
} catch (hostError) {
result.summary.errors.push(
`SSH host import error: ${hostError instanceof Error ? hostError.message : "Unknown error"}`,
);
}
}
} catch {
apiLogger.info("ssh_data table not found in import file, skipping");
}
try {
const importedCreds = importDb
.prepare("SELECT * FROM ssh_credentials")
.all();
for (const cred of importedCreds) {
try {
const existing = await mainDb
.select()
.from(sshCredentials)
.where(
and(
eq(sshCredentials.userId, userId),
eq(sshCredentials.name, cred.name),
eq(sshCredentials.username, cred.username),
),
);
if (existing.length > 0) {
result.summary.skippedItems++;
continue;
}
const credData = {
userId: userId,
name: cred.name,
description: cred.description,
folder: cred.folder,
tags: cred.tags,
authType: cred.auth_type,
username: cred.username,
password: cred.password,
key: cred.key,
privateKey: cred.private_key,
publicKey: cred.public_key,
keyPassword: cred.key_password,
keyType: cred.key_type,
detectedKeyType: cred.detected_key_type,
usageCount: cred.usage_count || 0,
lastUsed: cred.last_used,
createdAt: cred.created_at || new Date().toISOString(),
updatedAt: new Date().toISOString(),
};
const encrypted = DataCrypto.encryptRecord(
"ssh_credentials",
credData,
userId,
userDataKey,
);
await mainDb.insert(sshCredentials).values(encrypted);
result.summary.sshCredentialsImported++;
} catch (credError) {
result.summary.errors.push(
`SSH credential import error: ${credError instanceof Error ? credError.message : "Unknown error"}`,
);
}
}
} catch {
apiLogger.info(
"ssh_credentials table not found in import file, skipping",
);
}
const fileManagerTables = [
{
table: "file_manager_recent",
schema: fileManagerRecent,
key: "fileManagerItemsImported",
},
{
table: "file_manager_pinned",
schema: fileManagerPinned,
key: "fileManagerItemsImported",
},
{
table: "file_manager_shortcuts",
schema: fileManagerShortcuts,
key: "fileManagerItemsImported",
},
];
for (const { table, schema, key } of fileManagerTables) {
try {
const importedItems = importDb
.prepare(`SELECT * FROM ${table}`)
.all();
for (const item of importedItems) {
try {
const existing = await mainDb
.select()
.from(schema)
.where(
and(
eq(schema.userId, userId),
eq(schema.path, item.path),
eq(schema.name, item.name),
),
);
if (existing.length > 0) {
result.summary.skippedItems++;
continue;
}
const itemData = {
userId: userId,
hostId: item.host_id,
name: item.name,
path: item.path,
...(table === "file_manager_recent" && {
lastOpened: item.last_opened,
}),
...(table === "file_manager_pinned" && {
pinnedAt: item.pinned_at,
}),
...(table === "file_manager_shortcuts" && {
createdAt: item.created_at,
}),
};
await mainDb.insert(schema).values(itemData);
result.summary[key]++;
} catch (itemError) {
result.summary.errors.push(
`${table} import error: ${itemError instanceof Error ? itemError.message : "Unknown error"}`,
);
}
}
} catch {
apiLogger.info(`${table} table not found in import file, skipping`);
}
}
try {
const importedAlerts = importDb
.prepare("SELECT * FROM dismissed_alerts")
.all();
for (const alert of importedAlerts) {
try {
const existing = await mainDb
.select()
.from(dismissedAlerts)
.where(
and(
eq(dismissedAlerts.userId, userId),
eq(dismissedAlerts.alertId, alert.alert_id),
),
);
if (existing.length > 0) {
result.summary.skippedItems++;
continue;
}
await mainDb.insert(dismissedAlerts).values({
userId: userId,
alertId: alert.alert_id,
dismissedAt: alert.dismissed_at || new Date().toISOString(),
});
result.summary.dismissedAlertsImported++;
} catch (alertError) {
result.summary.errors.push(
`Dismissed alert import error: ${alertError instanceof Error ? alertError.message : "Unknown error"}`,
);
}
}
} catch {
apiLogger.info(
"dismissed_alerts table not found in import file, skipping",
);
}
const targetUser = await mainDb
.select()
.from(users)
.where(eq(users.id, userId));
if (targetUser.length > 0 && targetUser[0].is_admin) {
try {
const importedSettings = importDb
.prepare("SELECT * FROM settings")
.all();
for (const setting of importedSettings) {
try {
const existing = await mainDb
.select()
.from(settings)
.where(eq(settings.key, setting.key));
if (existing.length > 0) {
await mainDb
.update(settings)
.set({ value: setting.value })
.where(eq(settings.key, setting.key));
result.summary.settingsImported++;
} else {
await mainDb.insert(settings).values({
key: setting.key,
value: setting.value,
});
result.summary.settingsImported++;
}
} catch (settingError) {
result.summary.errors.push(
`Setting import error (${setting.key}): ${settingError instanceof Error ? settingError.message : "Unknown error"}`,
);
}
}
} catch {
apiLogger.info("settings table not found in import file, skipping");
}
} else {
apiLogger.info(
"Settings import skipped - only admin users can import settings",
);
}
result.success = true;
} finally {
if (importDb) {
importDb.close();
}
}
try {
fs.unlinkSync(req.file.path);
} catch {
apiLogger.warn("Failed to clean up uploaded file", {
operation: "file_cleanup_warning",
filePath: req.file.path,
});
}
res.json({
success: result.success,
message: result.success
? "Incremental import completed successfully"
: "Import failed",
summary: result.summary,
});
if (result.success) {
apiLogger.success("SQLite data imported successfully", {
operation: "sqlite_import_api_success",
userId,
summary: result.summary,
});
}
} catch (error) {
if (req.file?.path && fs.existsSync(req.file.path)) {
try {
fs.unlinkSync(req.file.path);
} catch {
apiLogger.warn("Failed to clean up uploaded file after error", {
operation: "file_cleanup_error",
filePath: req.file.path,
});
}
}
apiLogger.error("SQLite import failed", error, {
operation: "sqlite_import_api_failed",
userId: (req as AuthenticatedRequest).userId,
});
res.status(500).json({
error: "Failed to import SQLite data",
details: error instanceof Error ? error.message : "Unknown error",
});
}
},
);
app.post("/database/export/preview", authenticateJWT, async (req, res) => {
try {
const userId = (req as AuthenticatedRequest).userId;
const { scope = "user_data", includeCredentials = true } = req.body;
const exportData = await UserDataExport.exportUserData(userId, {
format: "encrypted",
scope,
includeCredentials,
});
const stats = UserDataExport.getExportStats(exportData);
res.json({
preview: true,
stats,
estimatedSize: JSON.stringify(exportData).length,
});
apiLogger.success("Export preview generated", {
operation: "export_preview_api_success",
userId,
totalRecords: stats.totalRecords,
});
} catch (error) {
apiLogger.error("Export preview failed", error, {
operation: "export_preview_api_failed",
});
res.status(500).json({
error: "Failed to generate export preview",
details: error instanceof Error ? error.message : "Unknown error",
});
}
});
app.post("/database/restore", requireAdmin, async (req, res) => {
try {
const { backupPath, targetPath } = req.body;
if (!backupPath) {
return res.status(400).json({ error: "Backup path is required" });
}
if (!DatabaseFileEncryption.isEncryptedDatabaseFile(backupPath)) {
return res.status(400).json({ error: "Invalid encrypted backup file" });
}
const restoredPath =
await DatabaseFileEncryption.restoreFromEncryptedBackup(
backupPath,
targetPath,
);
res.json({
success: true,
message: "Database restored successfully",
restoredPath,
});
} catch (error) {
apiLogger.error("Database restore failed", error, {
operation: "database_restore_api_failed",
});
res.status(500).json({
error: "Database restore failed",
details: error instanceof Error ? error.message : "Unknown error",
});
}
});
app.use("/users", userRoutes);
app.use("/ssh", sshRoutes);
app.use("/alerts", alertRoutes);
app.use("/credentials", credentialsRoutes);
app.use("/snippets", snippetsRoutes);
app.use(
(
err: unknown,
req: express.Request,
res: express.Response,
// eslint-disable-next-line @typescript-eslint/no-unused-vars
_next: express.NextFunction,
) => {
apiLogger.error("Unhandled error in request", err, {
operation: "error_handler",
method: req.method,
url: req.url,
userAgent: req.get("User-Agent"),
});
res.status(500).json({ error: "Internal Server Error" });
},
);
const HTTP_PORT = 30001;
async function initializeSecurity() {
try {
const authManager = AuthManager.getInstance();
await authManager.initialize();
DataCrypto.initialize();
} catch (error) {
databaseLogger.error("Failed to initialize security system", error, {
operation: "security_init_error",
});
throw error;
}
}
app.get(
"/database/migration/status",
authenticateJWT,
requireAdmin,
async (req, res) => {
try {
const dataDir = process.env.DATA_DIR || "./db/data";
const migration = new DatabaseMigration(dataDir);
const status = migration.checkMigrationStatus();
const dbPath = path.join(dataDir, "db.sqlite");
const encryptedDbPath = `${dbPath}.encrypted`;
const files = fs.readdirSync(dataDir);
const backupFiles = files.filter((f) => f.includes(".migration-backup-"));
const migratedFiles = files.filter((f) => f.includes(".migrated-"));
let unencryptedSize = 0;
let encryptedSize = 0;
if (status.hasUnencryptedDb) {
try {
unencryptedSize = fs.statSync(dbPath).size;
} catch {
// Ignore file access errors
}
}
if (status.hasEncryptedDb) {
try {
encryptedSize = fs.statSync(encryptedDbPath).size;
} catch {
// Ignore file access errors
}
}
res.json({
migrationStatus: status,
files: {
unencryptedDbSize: unencryptedSize,
encryptedDbSize: encryptedSize,
backupFiles: backupFiles.length,
migratedFiles: migratedFiles.length,
},
});
} catch (error) {
apiLogger.error("Failed to get migration status", error, {
operation: "migration_status_api_failed",
});
res.status(500).json({
error: "Failed to get migration status",
details: error instanceof Error ? error.message : "Unknown error",
});
}
},
);
app.get(
"/database/migration/history",
authenticateJWT,
requireAdmin,
async (req, res) => {
try {
const dataDir = process.env.DATA_DIR || "./db/data";
const files = fs.readdirSync(dataDir);
const backupFiles = files
.filter((f) => f.includes(".migration-backup-"))
.map((f) => {
const filePath = path.join(dataDir, f);
const stats = fs.statSync(filePath);
return {
name: f,
size: stats.size,
created: stats.birthtime,
modified: stats.mtime,
type: "backup",
};
})
.sort((a, b) => b.modified.getTime() - a.modified.getTime());
const migratedFiles = files
.filter((f) => f.includes(".migrated-"))
.map((f) => {
const filePath = path.join(dataDir, f);
const stats = fs.statSync(filePath);
return {
name: f,
size: stats.size,
created: stats.birthtime,
modified: stats.mtime,
type: "migrated",
};
})
.sort((a, b) => b.modified.getTime() - a.modified.getTime());
res.json({
files: [...backupFiles, ...migratedFiles],
summary: {
totalBackups: backupFiles.length,
totalMigrated: migratedFiles.length,
oldestBackup:
backupFiles.length > 0
? backupFiles[backupFiles.length - 1].created
: null,
newestBackup: backupFiles.length > 0 ? backupFiles[0].created : null,
},
});
} catch (error) {
apiLogger.error("Failed to get migration history", error, {
operation: "migration_history_api_failed",
});
res.status(500).json({
error: "Failed to get migration history",
details: error instanceof Error ? error.message : "Unknown error",
});
}
},
);
app.listen(HTTP_PORT, async () => {
const uploadsDir = path.join(process.cwd(), "uploads");
if (!fs.existsSync(uploadsDir)) {
fs.mkdirSync(uploadsDir, { recursive: true });
}
await initializeSecurity();
});
const sslConfig = AutoSSLSetup.getSSLConfig();
if (sslConfig.enabled) {
databaseLogger.info(`SSL is enabled`, {
operation: "ssl_info",
nginx_https_port: sslConfig.port,
backend_http_port: HTTP_PORT,
});
}