v1.9.0 (#437)
* fix: Resolve database encryption atomicity issues and enhance debugging (#430)

  This commit addresses critical data corruption caused by non-atomic file writes during database encryption, and adds comprehensive diagnostic logging to help debug encryption-related failures.

  **Problem:** Users reported "Unsupported state or unable to authenticate data" errors when starting the application after system crashes or Docker container restarts. The root cause was non-atomic writes of encrypted database files:

  1. Encrypted data file written (step 1)
  2. Metadata file written (step 2)

  If the process crashes between steps 1 and 2, the files become inconsistent: the data file holds the new IV/tag while the metadata file still holds the old IV/tag, GCM authentication fails on the next startup, and user data becomes permanently inaccessible.

  **Solution - Atomic Writes:**

  1. Write-to-temp + atomic-rename pattern:
     - Write to temporary files (*.tmp-timestamp-pid)
     - Perform atomic rename operations
     - Clean up temp files on failure
  2. Data integrity validation:
     - Add dataSize field to metadata
     - Verify file size before decryption
     - Early detection of corrupted writes
  3. Enhanced error diagnostics:
     - Key fingerprints (SHA256 prefix) for verification
     - File modification timestamps
     - Detailed GCM auth failure messages
     - Automatic diagnostic info generation

  **Changes:**

  database-file-encryption.ts:
  - Implement the atomic write pattern in encryptDatabaseFromBuffer
  - Implement the atomic write pattern in encryptDatabaseFile
  - Add dataSize field to the EncryptedFileMetadata interface
  - Validate file size before decryption in decryptDatabaseToBuffer
  - Enhance error messages for GCM auth failures
  - Add getDiagnosticInfo() function for comprehensive debugging
  - Add debug logging for all encryption/decryption operations

  system-crypto.ts:
  - Add detailed logging for DATABASE_KEY initialization
  - Log the key source (env var vs .env file)
  - Add key fingerprints to all log messages
  - Better error messages when key loading fails

  db/index.ts:
  - Automatically generate diagnostic info on decryption failure
  - Log detailed debugging information to help users troubleshoot

  **Debugging Info Added:**
  - Key initialization: source, fingerprint, length, path
  - Encryption: original size, encrypted size, IV/tag prefixes, temp paths
  - Decryption: file timestamps, metadata content, key fingerprint matching
  - Auth failures: .env file status, key availability, file consistency
  - File diagnostics: existence, readability, size validation, mtime comparison

  **Backward Compatibility:**
  - dataSize field is optional (metadata.dataSize?: number)
  - Old encrypted files without dataSize continue to work
  - No migration required

  **Testing:**
  - Compiled successfully
  - No breaking changes to existing APIs
  - Graceful handling of legacy v1 encrypted files

  Fixes data loss issues reported by users experiencing container restarts and system crashes during database saves.
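The write-to-temp + atomic-rename pattern described above can be sketched as below. This is a minimal illustration, not the actual database-file-encryption.ts code: the function name, the single-file layout (IV + auth tag + ciphertext), and the omission of the dataSize check are assumptions made for brevity.

```typescript
import { randomBytes, createCipheriv } from "node:crypto";
import { writeFileSync, renameSync, unlinkSync, existsSync } from "node:fs";

// Hypothetical sketch of the write-to-temp + atomic-rename pattern.
function encryptToFileAtomic(destPath: string, plaintext: Buffer, key: Buffer): void {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const encrypted = Buffer.concat([cipher.update(plaintext), cipher.final()]);
  const tag = cipher.getAuthTag(); // 16 bytes by default

  // Unique temp name (*.tmp-timestamp-pid) avoids collisions between writers.
  const tmpPath = `${destPath}.tmp-${Date.now()}-${process.pid}`;
  try {
    // IV, auth tag, and ciphertext land in ONE file, so a crash can never
    // leave metadata out of sync with the data it describes.
    writeFileSync(tmpPath, Buffer.concat([iv, tag, encrypted]));
    // rename() is atomic on POSIX filesystems: readers see either the old
    // complete file or the new complete file, never a partial write.
    renameSync(tmpPath, destPath);
  } catch (err) {
    // Clean up the temp file on failure so it doesn't accumulate.
    if (existsSync(tmpPath)) unlinkSync(tmpPath);
    throw err;
  }
}
```

Because the rename is atomic, a crash at any point leaves either the previous complete file or the new complete file on disk, which is exactly what eliminates the "new IV/tag in data file, old IV/tag in metadata" failure mode.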
* fix: Cleanup PR
* Update src/backend/utils/database-file-encryption.ts
  Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
---------
Co-authored-by: LukeGus <bugattiguy527@gmail.com>
Co-authored-by: Luke Gustafson <88517757+LukeGus@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
* fix: Merge metadata and DB into 1 file
* fix: Add initial command palette
* Feature/german language support (#431)
  * Update translation.json
    Fixed some German translations, making them more user-friendly and idiomatic.
  * Update translation.json
    Added an updated block for serverStats.
  * Update translation.json
    Added translations.
  * Update translation.json
    Removed duplicate of "free": "Free".
* feat: Finalize command palette
* fix: Several bug fixes for terminals, server stats, and general feature improvements
* feat: Enhanced security, UI improvements, and animations (#432)
  * fix: Remove empty catch blocks and add error logging
  * refactor: Modularize server stats widget collectors
  * feat: Add i18n support for terminal customization and login stats
    - Add comprehensive terminal customization translations (60+ keys) for appearance, behavior, and advanced settings across all 4 languages
    - Add SSH login statistics translations
    - Update HostManagerEditor to use i18n for all terminal customization UI elements
    - Update LoginStatsWidget to use i18n for all UI text
    - Add missing logger imports in backend files for improved debugging
  * feat: Add keyboard shortcut enhancements with Kbd component
    - Add shadcn kbd component for displaying keyboard shortcuts
    - Enhance the file manager context menu to display shortcuts with the Kbd component
    - Add 5 new keyboard shortcuts to the file manager:
      - Ctrl+D: Download selected files
      - Ctrl+N: Create new file
      - Ctrl+Shift+N: Create new folder
      - Ctrl+U: Upload files
      - Enter: Open/run selected file
    - Add keyboard shortcut hints to the command palette footer
    - Create a helper function to parse and render keyboard shortcuts
  * feat: Add i18n support for command palette
    - Add commandPalette translation section with 22 keys to all 4 languages
    - Update the CommandPalette component to use i18n for all UI text
    - Translate the search placeholder, group headings, menu items, and shortcut hints
    - Support a multilingual command palette interface
  * feat: Add smooth transitions and animations to UI
    - Add fade-in/fade-out transition to the command palette (200ms)
    - Add scale animation to the command palette on open/close
    - Add smooth popup animation to the context menu (150ms)
    - Add visual feedback for file selection with a ring effect
    - Add hover scale effect to file grid items
    - Add transition-all to list view items for consistent behavior
    - Zero JavaScript overhead, pure CSS transitions
    - All animations under 200ms for an instant feel
  * feat: Add button active state and dashboard card animations
    - Add active:scale-95 to all buttons for tactile click feedback
    - Add hover border effect to dashboard cards (150ms transition)
    - Add pulse animation to dashboard loading states
    - Pure CSS transitions with zero JavaScript overhead
    - Improves the enterprise-level feel of the UI
  * feat: Add smooth macOS-style page transitions
    - Add fullscreen crossfade transition for login/logout (300ms fade-out + 400ms fade-in)
    - Add slide-in-from-right animation for all page switches (Dashboard, Terminal, SSH Manager, Admin, Profile)
    - Fix TypeScript compilation by adding esModuleInterop to tsconfig.node.json
    - Pass handleLogout from DesktopApp to LeftSidebar for consistent transition behavior
    All page transitions now use Tailwind animate-in utilities with a 300ms duration for smooth, native-feeling UX.
  * fix: Add key prop to force animation re-trigger on tab switch
    Each page container now has key={currentTab} so React unmounts and remounts the element on every tab switch, properly triggering the slide-in animation.
  * revert: Remove page transition animations
    Page switching animations were not noticeable enough and felt unnecessary. Keep only the login/logout fullscreen crossfade transitions, which provide clear visual feedback for authentication state changes.
  * feat: Add ripple effect to login/logout transitions
    Add a three-layer expanding ripple animation during the fadeOut phase:
    - Ripples expand from the screen center using the primary theme color
    - Each layer has a staggered delay (0ms, 150ms, 300ms) for a wave effect
    - Ripples fade out as they expand to create elegant visual feedback
    - Uses a pure CSS keyframe animation, no external libraries
    Total animation: 800ms ripple + 300ms screen fade.
  * feat: Add smooth TERMIX logo animation to transitions
    - Extend transition duration from 300ms/400ms to 800ms/600ms for a more elegant feel
    - Reduce ripple intensity from /20,/15,/10 to /8,/5 for subtlety
    - Slow down the ripple animation from 0.8s to 2s with cubic-bezier easing
    - Add a centered TERMIX logo with monospace font and subtitle
    - Logo fades in from 80% scale, holds, then fades out at 110% scale
    - Total effect: 1.2s logo animation synced with 2s ripple waves
    Creates a premium, branded transition experience.
  * feat: Enhance transition animation with premium details
    Timing adjustments:
    - Extend fadeOut from 800ms to 1200ms
    - Extend fadeIn from 600ms to 800ms
    - Slow the background fade to 700ms for elegance
    Visual enhancements:
    - Add 4-layer ripple waves (10%, 7%, 5%, 3% opacity) with staggered delays
    - Extend the ripple animation to 2.5s with a refined opacity curve
    - Logo blur effect: starts at 8px, sharpens to 0px, exits at 4px
    - Logo glow effect: triple-layer text-shadow using the primary theme color
    - Increase the logo size from text-6xl to text-7xl
    - Subtitle delayed fade-in from the bottom with a smooth slide animation
    Creates a cinematic, polished brand experience.
  * feat: Redesign login page with split-screen cinematic layout
    Major redesign of the authentication page.
    Left side (40% width):
    - Full-height gradient background using the primary theme color
    - Large TERMIX logo with glow effect
    - Subtitle and tagline
    - Infinite animated ripple waves (3 layers)
    - Hidden on mobile; shows brand identity
    Right side (60% width):
    - Centered glassmorphism card with backdrop blur
    - Refined tab switcher with a pill-style active state
    - Enlarged title with a gradient text effect
    - Added welcome subtitles for better UX
    - Card slides in from the bottom on load
    - All existing functionality preserved
    Visual enhancements:
    - Tab navigation: segmented control style in a muted container
    - Active tab: white background with a subtle shadow
    - Smooth 200ms transitions on all interactions
    - Card: rounded-2xl, shadow-xl, semi-transparent border
    Creates a premium, modern login experience matching the transition animations.
  * feat: Update login page theme colors and add i18n support
    - Changed the login page gradient from blue to match the dark theme colors
    - Updated ripple effects to use the theme primary color
    - Added i18n translation keys for the login page (auth.tagline, auth.description, auth.welcomeBack, auth.createAccount, auth.continueExternal)
    - Updated all language files (en, zh, de, ru, pt-BR) with the new translations
    - Fixed TypeScript compilation issues by clearing the build cache
  * refactor: Use shadcn Tabs component and fix modal styling
    - Replace the custom tab navigation with the shadcn Tabs component
    - Restore border-2 border-dark-border for modal consistency
    - Remove the circular icon from the login success message
    - Simplify the authentication success display
  * refactor: Remove ripple effects and gradient from login page
    - Remove the animated ripple background effects
    - Remove the gradient background; use a solid color (bg-dark-bg-darker)
    - Remove the text-shadow glow effect from the logo
    - Simplify the brand showcase to a clean, minimal design
  * feat: Add decorative slash and remove subtitle from login page
    - Add a decorative slash divider with gradient lines below the TERMIX logo
    - Remove the subtitle text (welcomeBack and createAccount)
    - Simplify the page title to show only the main heading
  * feat: Add diagonal line pattern background to login page
    - Replace the decorative slash with a subtle diagonal line pattern background
    - Use repeating-linear-gradient at a 45deg angle
    - Set a very low opacity (0.03) for a subtle effect
    - The pattern uses the theme primary color
  * fix: Display diagonal line pattern on login background
    - Combine the background color and pattern in a single style attribute
    - Use white semi-transparent lines (rgba, 0.03 opacity)
    - 45deg angle, 35px spacing, 2px width
    - Remove the separate overlay div to ensure pattern visibility
  * security: Fix user enumeration vulnerability in login
    - Unify error messages for invalid username and incorrect password
    - Both return a 401 status with 'Invalid username or password'
    - Prevents attackers from enumerating valid usernames
    - Maintains detailed logging for debugging purposes
    - Changed from 404 'User not found' to a generic auth failure message
  * security: Add login rate limiting to prevent brute force attacks
    - Implement LoginRateLimiter with IP- and username-based tracking
    - Block after 5 failed attempts within 15 minutes
    - Lock the account/IP for 15 minutes after the threshold
    - Automatic cleanup of expired entries every 5 minutes
    - Track remaining attempts in logs for monitoring
    - Return a 429 status with the remaining time on rate limit
    - Reset counters on successful login
    - Dual protection: both IP-based and username-based limits
* French translation (#434)
  * Adding French language
  * Enhancements
* feat: Replace the old SSH tools system with a new dedicated sidebar
* fix: Merge zac/luke
* fix: Finalize new sidebar, improve loading animations
* Added ability to close non-primary tabs involved in a split view (#435)
* fix: General bug fixes/small feature improvements
* feat: General UI improvements and translation updates
* fix: Command history and file manager styling issues
* feat: General bug fixes, added server stat commands, improved split screen, link accounts, etc.
* fix: add Accept header for OIDC callback request (#436)
  * Delete DOWNLOADS.md
  * fix: add Accept header for OIDC callback request
  ---------
  Co-authored-by: Luke Gustafson <88517757+LukeGus@users.noreply.github.com>
* fix: More bug fixes and QOL fixes
* fix: Server stats not respecting interval and fixed SSH tool type issues
* fix: Remove GitHub links
* fix: Delete account spacing
* fix: Increment version
* fix: Unable to delete hosts and add nginx for terminal
* fix: Unable to delete hosts
* fix: OIDC/local account linking breaking both logins
* chore: File cleanup
* feat: Max terminal tab size and save current file manager sorting type
* fix: Terminal display issue, migrate host editor to use combobox
* feat: Add snippet folder/customization system
* fix: Fix OIDC linking and prep release
* fix: Increment version
---------
Co-authored-by: ZacharyZcR <zacharyzcr1984@gmail.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Max <herzmaximilian@gmail.com>
Co-authored-by: SlimGary <trash.slim@gmail.com>
Co-authored-by: jarrah31 <jarrah31@gmail.com>
Co-authored-by: Kf637 <mail@kf637.tech>
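The dual IP/username login rate limiting described in the security commit above can be sketched roughly as follows. The thresholds mirror the commit message (block after 5 failed attempts within a 15-minute window), but the class shape and method names here are illustrative, not the actual LoginRateLimiter implementation:

```typescript
// Illustrative sketch of failed-login tracking with a sliding window.
const MAX_ATTEMPTS = 5;
const WINDOW_MS = 15 * 60 * 1000; // 15 minutes

interface Entry {
  count: number;
  firstAttempt: number;
}

class LoginRateLimiter {
  private attempts = new Map<string, Entry>();

  // Called on every failed login, once with the client IP key and once
  // with the username key ("dual protection" from the commit message).
  recordFailure(key: string, now = Date.now()): void {
    const entry = this.attempts.get(key);
    if (!entry || now - entry.firstAttempt > WINDOW_MS) {
      this.attempts.set(key, { count: 1, firstAttempt: now });
    } else {
      entry.count += 1;
    }
  }

  // True once the threshold is reached inside the window; the lock
  // expires together with the window.
  isBlocked(key: string, now = Date.now()): boolean {
    const entry = this.attempts.get(key);
    if (!entry) return false;
    if (now - entry.firstAttempt > WINDOW_MS) {
      this.attempts.delete(key); // expired entries double as cleanup
      return false;
    }
    return entry.count >= MAX_ATTEMPTS;
  }

  // Reset counters on successful login, as the commit describes.
  reset(key: string): void {
    this.attempts.delete(key);
  }
}
```

In the real login flow such a limiter would be consulted with both a hypothetical `ip:<addr>` and `user:<name>` key before verifying credentials, responding 429 with the remaining lock time when either is blocked, with a periodic sweep clearing expired entries.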
This commit was merged in pull request #437.
@@ -8,6 +8,9 @@ import {
fileManagerRecent,
fileManagerPinned,
fileManagerShortcuts,
sshFolders,
commandHistory,
recentActivity,
} from "../db/schema.js";
import { eq, and, desc, isNotNull, or } from "drizzle-orm";
import type { Request, Response } from "express";
@@ -234,6 +237,8 @@ router.post(
enableFileManager,
defaultPath,
tunnelConnections,
jumpHosts,
quickActions,
statsConfig,
terminalConfig,
forceKeyboardInteractive,
@@ -270,6 +275,10 @@ router.post(
tunnelConnections: Array.isArray(tunnelConnections)
? JSON.stringify(tunnelConnections)
: null,
jumpHosts: Array.isArray(jumpHosts) ? JSON.stringify(jumpHosts) : null,
quickActions: Array.isArray(quickActions)
? JSON.stringify(quickActions)
: null,
enableFileManager: enableFileManager ? 1 : 0,
defaultPath: defaultPath || null,
statsConfig: statsConfig ? JSON.stringify(statsConfig) : null,
@@ -328,6 +337,9 @@ router.post(
tunnelConnections: createdHost.tunnelConnections
? JSON.parse(createdHost.tunnelConnections as string)
: [],
jumpHosts: createdHost.jumpHosts
? JSON.parse(createdHost.jumpHosts as string)
: [],
enableFileManager: !!createdHost.enableFileManager,
statsConfig: createdHost.statsConfig
? JSON.parse(createdHost.statsConfig as string)
@@ -349,6 +361,28 @@ router.post(
},
);

try {
const axios = (await import("axios")).default;
const statsPort = process.env.STATS_PORT || 30005;
await axios.post(
`http://localhost:${statsPort}/host-updated`,
{ hostId: createdHost.id },
{
headers: {
Authorization: req.headers.authorization || "",
Cookie: req.headers.cookie || "",
},
timeout: 5000,
},
);
} catch (err) {
sshLogger.warn("Failed to notify stats server of new host", {
operation: "host_create",
hostId: createdHost.id as number,
error: err instanceof Error ? err.message : String(err),
});
}

res.json(resolvedHost);
} catch (err) {
sshLogger.error("Failed to save SSH host to database", err, {
@@ -369,6 +403,7 @@ router.post(
router.put(
"/db/host/:id",
authenticateJWT,
requireDataAccess,
upload.single("key"),
async (req: Request, res: Response) => {
const hostId = req.params.id;
@@ -424,6 +459,8 @@ router.put(
enableFileManager,
defaultPath,
tunnelConnections,
jumpHosts,
quickActions,
statsConfig,
terminalConfig,
forceKeyboardInteractive,
@@ -461,6 +498,10 @@ router.put(
tunnelConnections: Array.isArray(tunnelConnections)
? JSON.stringify(tunnelConnections)
: null,
jumpHosts: Array.isArray(jumpHosts) ? JSON.stringify(jumpHosts) : null,
quickActions: Array.isArray(quickActions)
? JSON.stringify(quickActions)
: null,
enableFileManager: enableFileManager ? 1 : 0,
defaultPath: defaultPath || null,
statsConfig: statsConfig ? JSON.stringify(statsConfig) : null,
@@ -537,6 +578,9 @@ router.put(
tunnelConnections: updatedHost.tunnelConnections
? JSON.parse(updatedHost.tunnelConnections as string)
: [],
jumpHosts: updatedHost.jumpHosts
? JSON.parse(updatedHost.jumpHosts as string)
: [],
enableFileManager: !!updatedHost.enableFileManager,
statsConfig: updatedHost.statsConfig
? JSON.parse(updatedHost.statsConfig as string)
@@ -558,6 +602,28 @@ router.put(
},
);

try {
const axios = (await import("axios")).default;
const statsPort = process.env.STATS_PORT || 30005;
await axios.post(
`http://localhost:${statsPort}/host-updated`,
{ hostId: parseInt(hostId) },
{
headers: {
Authorization: req.headers.authorization || "",
Cookie: req.headers.cookie || "",
},
timeout: 5000,
},
);
} catch (err) {
sshLogger.warn("Failed to notify stats server of host update", {
operation: "host_update",
hostId: parseInt(hostId),
error: err instanceof Error ? err.message : String(err),
});
}

res.json(resolvedHost);
} catch (err) {
sshLogger.error("Failed to update SSH host in database", err, {
@@ -576,67 +642,77 @@ router.put(

// Route: Get SSH data for the authenticated user (requires JWT)
// GET /ssh/host
router.get("/db/host", authenticateJWT, async (req: Request, res: Response) => {
const userId = (req as AuthenticatedRequest).userId;
if (!isNonEmptyString(userId)) {
sshLogger.warn("Invalid userId for SSH data fetch", {
operation: "host_fetch",
userId,
});
return res.status(400).json({ error: "Invalid userId" });
}
try {
const data = await SimpleDBOps.select(
db.select().from(sshData).where(eq(sshData.userId, userId)),
"ssh_data",
userId,
);
router.get(
"/db/host",
authenticateJWT,
requireDataAccess,
async (req: Request, res: Response) => {
const userId = (req as AuthenticatedRequest).userId;
if (!isNonEmptyString(userId)) {
sshLogger.warn("Invalid userId for SSH data fetch", {
operation: "host_fetch",
userId,
});
return res.status(400).json({ error: "Invalid userId" });
}
try {
const data = await SimpleDBOps.select(
db.select().from(sshData).where(eq(sshData.userId, userId)),
"ssh_data",
userId,
);

const result = await Promise.all(
data.map(async (row: Record<string, unknown>) => {
const baseHost = {
...row,
tags:
typeof row.tags === "string"
? row.tags
? row.tags.split(",").filter(Boolean)
: []
const result = await Promise.all(
data.map(async (row: Record<string, unknown>) => {
const baseHost = {
...row,
tags:
typeof row.tags === "string"
? row.tags
? row.tags.split(",").filter(Boolean)
: []
: [],
pin: !!row.pin,
enableTerminal: !!row.enableTerminal,
enableTunnel: !!row.enableTunnel,
tunnelConnections: row.tunnelConnections
? JSON.parse(row.tunnelConnections as string)
: [],
pin: !!row.pin,
enableTerminal: !!row.enableTerminal,
enableTunnel: !!row.enableTunnel,
tunnelConnections: row.tunnelConnections
? JSON.parse(row.tunnelConnections as string)
: [],
enableFileManager: !!row.enableFileManager,
statsConfig: row.statsConfig
? JSON.parse(row.statsConfig as string)
: undefined,
terminalConfig: row.terminalConfig
? JSON.parse(row.terminalConfig as string)
: undefined,
forceKeyboardInteractive: row.forceKeyboardInteractive === "true",
};
jumpHosts: row.jumpHosts ? JSON.parse(row.jumpHosts as string) : [],
quickActions: row.quickActions
? JSON.parse(row.quickActions as string)
: [],
enableFileManager: !!row.enableFileManager,
statsConfig: row.statsConfig
? JSON.parse(row.statsConfig as string)
: undefined,
terminalConfig: row.terminalConfig
? JSON.parse(row.terminalConfig as string)
: undefined,
forceKeyboardInteractive: row.forceKeyboardInteractive === "true",
};

return (await resolveHostCredentials(baseHost)) || baseHost;
}),
);
return (await resolveHostCredentials(baseHost)) || baseHost;
}),
);

res.json(result);
} catch (err) {
sshLogger.error("Failed to fetch SSH hosts from database", err, {
operation: "host_fetch",
userId,
});
res.status(500).json({ error: "Failed to fetch SSH data" });
}
});
res.json(result);
} catch (err) {
sshLogger.error("Failed to fetch SSH hosts from database", err, {
operation: "host_fetch",
userId,
});
res.status(500).json({ error: "Failed to fetch SSH data" });
}
},
);

// Route: Get SSH host by ID (requires JWT)
// GET /ssh/host/:id
router.get(
"/db/host/:id",
authenticateJWT,
requireDataAccess,
async (req: Request, res: Response) => {
const hostId = req.params.id;
const userId = (req as AuthenticatedRequest).userId;
@@ -679,6 +755,8 @@ router.get(
tunnelConnections: host.tunnelConnections
? JSON.parse(host.tunnelConnections)
: [],
jumpHosts: host.jumpHosts ? JSON.parse(host.jumpHosts) : [],
quickActions: host.quickActions ? JSON.parse(host.quickActions) : [],
enableFileManager: !!host.enableFileManager,
statsConfig: host.statsConfig
? JSON.parse(host.statsConfig)
@@ -783,6 +861,7 @@ router.get(
router.delete(
"/db/host/:id",
authenticateJWT,
requireDataAccess,
async (req: Request, res: Response) => {
const userId = (req as AuthenticatedRequest).userId;
const hostId = req.params.id;
@@ -816,8 +895,8 @@ router.delete(
.delete(fileManagerRecent)
.where(
and(
eq(fileManagerRecent.userId, userId),
eq(fileManagerRecent.hostId, numericHostId),
eq(fileManagerRecent.userId, userId),
),
);

@@ -825,8 +904,8 @@ router.delete(
.delete(fileManagerPinned)
.where(
and(
eq(fileManagerPinned.userId, userId),
eq(fileManagerPinned.hostId, numericHostId),
eq(fileManagerPinned.userId, userId),
),
);

@@ -834,8 +913,17 @@ router.delete(
.delete(fileManagerShortcuts)
.where(
and(
eq(fileManagerShortcuts.userId, userId),
eq(fileManagerShortcuts.hostId, numericHostId),
eq(fileManagerShortcuts.userId, userId),
),
);

await db
.delete(commandHistory)
.where(
and(
eq(commandHistory.hostId, numericHostId),
eq(commandHistory.userId, userId),
),
);

@@ -843,8 +931,17 @@ router.delete(
.delete(sshCredentialUsage)
.where(
and(
eq(sshCredentialUsage.userId, userId),
eq(sshCredentialUsage.hostId, numericHostId),
eq(sshCredentialUsage.userId, userId),
),
);

await db
.delete(recentActivity)
.where(
and(
eq(recentActivity.hostId, numericHostId),
eq(recentActivity.userId, userId),
),
);

@@ -865,6 +962,28 @@ router.delete(
},
);

try {
const axios = (await import("axios")).default;
const statsPort = process.env.STATS_PORT || 30005;
await axios.post(
`http://localhost:${statsPort}/host-deleted`,
{ hostId: numericHostId },
{
headers: {
Authorization: req.headers.authorization || "",
Cookie: req.headers.cookie || "",
},
timeout: 5000,
},
);
} catch (err) {
sshLogger.warn("Failed to notify stats server of host deletion", {
operation: "host_delete",
hostId: numericHostId,
error: err instanceof Error ? err.message : String(err),
});
}

res.json({ message: "SSH host deleted" });
} catch (err) {
sshLogger.error("Failed to delete SSH host from database", err, {
@@ -1241,6 +1360,94 @@ router.delete(
},
);

// Route: Get command history for a host
// GET /ssh/command-history/:hostId
router.get(
"/command-history/:hostId",
authenticateJWT,
async (req: Request, res: Response) => {
const userId = (req as AuthenticatedRequest).userId;
const hostId = parseInt(req.params.hostId, 10);

if (!isNonEmptyString(userId) || !hostId) {
sshLogger.warn("Invalid userId or hostId for command history fetch", {
operation: "command_history_fetch",
hostId,
userId,
});
return res.status(400).json({ error: "Invalid userId or hostId" });
}

try {
const history = await db
.select({
id: commandHistory.id,
command: commandHistory.command,
})
.from(commandHistory)
.where(
and(
eq(commandHistory.userId, userId),
eq(commandHistory.hostId, hostId),
),
)
.orderBy(desc(commandHistory.executedAt))
.limit(200);

res.json(history.map((h) => h.command));
} catch (err) {
sshLogger.error("Failed to fetch command history from database", err, {
operation: "command_history_fetch",
hostId,
userId,
});
res.status(500).json({ error: "Failed to fetch command history" });
}
},
);

// Route: Delete command from history
// DELETE /ssh/command-history
router.delete(
"/command-history",
authenticateJWT,
async (req: Request, res: Response) => {
const userId = (req as AuthenticatedRequest).userId;
const { hostId, command } = req.body;

if (!isNonEmptyString(userId) || !hostId || !command) {
sshLogger.warn("Invalid data for command history deletion", {
operation: "command_history_delete",
hostId,
userId,
});
return res.status(400).json({ error: "Invalid data" });
}

try {
await db
.delete(commandHistory)
.where(
and(
eq(commandHistory.userId, userId),
eq(commandHistory.hostId, hostId),
eq(commandHistory.command, command),
),
);

res.json({ message: "Command deleted from history" });
} catch (err) {
sshLogger.error("Failed to delete command from history", err, {
operation: "command_history_delete",
hostId,
userId,
command,
});
res.status(500).json({ error: "Failed to delete command" });
}
},
);

async function resolveHostCredentials(
host: Record<string, unknown>,
): Promise<Record<string, unknown>> {
@@ -1341,6 +1548,16 @@ router.put(

DatabaseSaveTrigger.triggerSave("folder_rename");

await db
.update(sshFolders)
.set({
name: newName,
updatedAt: new Date().toISOString(),
})
.where(
and(eq(sshFolders.userId, userId), eq(sshFolders.name, oldName)),
);

res.json({
message: "Folder renamed successfully",
updatedHosts: updatedHosts.length,
@@ -1358,6 +1575,170 @@ router.put(
},
);

// Route: Get all folders with metadata (requires JWT)
// GET /ssh/db/folders
router.get("/folders", authenticateJWT, async (req: Request, res: Response) => {
const userId = (req as AuthenticatedRequest).userId;

if (!isNonEmptyString(userId)) {
return res.status(400).json({ error: "Invalid user ID" });
}

try {
const folders = await db
.select()
.from(sshFolders)
.where(eq(sshFolders.userId, userId));

res.json(folders);
} catch (err) {
sshLogger.error("Failed to fetch folders", err, {
operation: "fetch_folders",
userId,
});
res.status(500).json({ error: "Failed to fetch folders" });
}
});

// Route: Update folder metadata (requires JWT)
// PUT /ssh/db/folders/metadata
router.put(
"/folders/metadata",
authenticateJWT,
async (req: Request, res: Response) => {
const userId = (req as AuthenticatedRequest).userId;
const { name, color, icon } = req.body;

if (!isNonEmptyString(userId) || !name) {
return res.status(400).json({ error: "Folder name is required" });
}

try {
const existing = await db
.select()
.from(sshFolders)
.where(and(eq(sshFolders.userId, userId), eq(sshFolders.name, name)))
.limit(1);

if (existing.length > 0) {
await db
.update(sshFolders)
.set({
color,
icon,
updatedAt: new Date().toISOString(),
})
.where(and(eq(sshFolders.userId, userId), eq(sshFolders.name, name)));
} else {
await db.insert(sshFolders).values({
userId,
name,
color,
icon,
createdAt: new Date().toISOString(),
updatedAt: new Date().toISOString(),
});
}

DatabaseSaveTrigger.triggerSave("folder_metadata_update");

res.json({ message: "Folder metadata updated successfully" });
} catch (err) {
sshLogger.error("Failed to update folder metadata", err, {
operation: "update_folder_metadata",
userId,
name,
});
res.status(500).json({ error: "Failed to update folder metadata" });
}
},
);

// Route: Delete all hosts in folder (requires JWT)
// DELETE /ssh/db/folders/:name/hosts
router.delete(
"/folders/:name/hosts",
authenticateJWT,
async (req: Request, res: Response) => {
const userId = (req as AuthenticatedRequest).userId;
const folderName = req.params.name;

if (!isNonEmptyString(userId) || !folderName) {
return res.status(400).json({ error: "Invalid folder name" });
}

try {
const hostsToDelete = await db
.select()
.from(sshData)
.where(and(eq(sshData.userId, userId), eq(sshData.folder, folderName)));

if (hostsToDelete.length === 0) {
return res.json({
message: "No hosts found in folder",
deletedCount: 0,
});
}

await db
.delete(sshData)
.where(and(eq(sshData.userId, userId), eq(sshData.folder, folderName)));

await db
.delete(sshFolders)
.where(
and(eq(sshFolders.userId, userId), eq(sshFolders.name, folderName)),
);

DatabaseSaveTrigger.triggerSave("folder_hosts_delete");

try {
const axios = (await import("axios")).default;
const statsPort = process.env.STATS_PORT || 30005;
for (const host of hostsToDelete) {
try {
await axios.post(
`http://localhost:${statsPort}/host-deleted`,
{ hostId: host.id },
{
headers: {
Authorization: req.headers.authorization || "",
Cookie: req.headers.cookie || "",
},
timeout: 5000,
},
);
} catch (err) {
sshLogger.warn("Failed to notify stats server of host deletion", {
operation: "folder_hosts_delete",
hostId: host.id,
error: err instanceof Error ? err.message : String(err),
});
}
}
} catch (err) {
sshLogger.warn("Failed to notify stats server of folder deletion", {
operation: "folder_hosts_delete",
folderName,
error: err instanceof Error ? err.message : String(err),
});
}

res.json({
message: "All hosts in folder deleted successfully",
deletedCount: hostsToDelete.length,
});
} catch (err) {
sshLogger.error("Failed to delete hosts in folder", err, {
operation: "delete_folder_hosts",
userId,
folderName,
});
res.status(500).json({ error: "Failed to delete hosts in folder" });
}
},
);

// Route: Bulk import SSH hosts (requires JWT)
// POST /ssh/bulk-import
router.post(