* fix: Resolve database encryption atomicity issues and enhance debugging (#430)

* fix: Resolve database encryption atomicity issues and enhance debugging

This commit addresses critical data corruption issues caused by non-atomic
file writes during database encryption, and adds comprehensive diagnostic
logging to help debug encryption-related failures.

**Problem:**
Users reported "Unsupported state or unable to authenticate data" errors
when starting the application after system crashes or Docker container
restarts. The root cause was non-atomic writes of encrypted database files:

1. The encrypted data file is written
2. The metadata file is written
→ If the process crashes between steps 1 and 2, the files become inconsistent
→ New IV/tag in the data file, old IV/tag in the metadata
→ GCM authentication fails on the next startup
→ User data becomes permanently inaccessible
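
The failure chain above can be reproduced with Node's built-in crypto module: decrypting the new ciphertext against stale metadata fails GCM authentication with exactly the reported error. A minimal sketch (illustrative key and payloads, not the project's actual encryption code):

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

const key = randomBytes(32); // illustrative AES-256 key, not the real DATABASE_KEY

// Step 1 of the write sequence produces the data file's bytes;
// step 2 writes the metadata (IV + GCM auth tag) separately.
function encrypt(plaintext: Buffer) {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const data = Buffer.concat([cipher.update(plaintext), cipher.final()]);
  return { iv, tag: cipher.getAuthTag(), data };
}

function tryDecrypt(meta: { iv: Buffer; tag: Buffer }, data: Buffer) {
  try {
    const d = createDecipheriv("aes-256-gcm", key, meta.iv);
    d.setAuthTag(meta.tag);
    return { ok: true, value: Buffer.concat([d.update(data), d.final()]).toString() };
  } catch (err) {
    return { ok: false, value: (err as Error).message }; // GCM auth failure lands here
  }
}

const oldWrite = encrypt(Buffer.from("old db contents"));
const newWrite = encrypt(Buffer.from("new db contents"));

// Consistent data + metadata pair: decrypts fine.
console.log(tryDecrypt(newWrite, newWrite.data).value); // "new db contents"

// Crash between steps 1 and 2: new data file, stale metadata.
console.log(tryDecrypt(oldWrite, newWrite.data).value);
// "Unsupported state or unable to authenticate data"
```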

**Solution - Atomic Writes:**

1. Write-to-temp + atomic-rename pattern:
   - Write to temporary files (*.tmp-timestamp-pid)
   - Perform atomic rename operations
   - Clean up temp files on failure

2. Data integrity validation:
   - Add dataSize field to metadata
   - Verify file size before decryption
   - Early detection of corrupted writes

3. Enhanced error diagnostics:
   - Key fingerprints (SHA256 prefix) for verification
   - File modification timestamps
   - Detailed GCM auth failure messages
   - Automatic diagnostic info generation
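
The write-to-temp + atomic-rename pattern in item 1 can be sketched as follows (`atomicWriteFileSync` is a hypothetical helper, not the project's actual implementation; note that `rename()` is only atomic when the temp file lives on the same filesystem as the target, which is why the temp file is created next to it):

```typescript
import { writeFileSync, renameSync, unlinkSync, existsSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";

// Write `contents` to `finalPath` atomically: write a uniquely named
// temp file first, then rename it over the target. rename() on the
// same filesystem is atomic, so readers see either the old file or
// the new one, never a half-written mix.
function atomicWriteFileSync(finalPath: string, contents: Buffer): void {
  const tmpPath = `${finalPath}.tmp-${Date.now()}-${process.pid}`;
  try {
    writeFileSync(tmpPath, contents);
    renameSync(tmpPath, finalPath);
  } catch (err) {
    // Clean up the temp file on failure so stale *.tmp-* files
    // don't accumulate next to the database.
    if (existsSync(tmpPath)) unlinkSync(tmpPath);
    throw err;
  }
}

const target = join(tmpdir(), "termix-demo.enc");
atomicWriteFileSync(target, Buffer.from("encrypted bytes"));
```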

**Changes:**

database-file-encryption.ts:
- Implement atomic write pattern in encryptDatabaseFromBuffer
- Implement atomic write pattern in encryptDatabaseFile
- Add dataSize field to EncryptedFileMetadata interface
- Validate file size before decryption in decryptDatabaseToBuffer
- Enhance error messages for GCM auth failures
- Add getDiagnosticInfo() function for comprehensive debugging
- Add debug logging for all encryption/decryption operations

system-crypto.ts:
- Add detailed logging for DATABASE_KEY initialization
- Log key source (env var vs .env file)
- Add key fingerprints to all log messages
- Better error messages when key loading fails

db/index.ts:
- Automatically generate diagnostic info on decryption failure
- Log detailed debugging information to help users troubleshoot

**Debugging Info Added:**

- Key initialization: source, fingerprint, length, path
- Encryption: original size, encrypted size, IV/tag prefixes, temp paths
- Decryption: file timestamps, metadata content, key fingerprint matching
- Auth failures: .env file status, key availability, file consistency
- File diagnostics: existence, readability, size validation, mtime comparison
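
A key "fingerprint" as used above is just a short SHA-256 prefix: enough to tell two keys apart in logs without leaking key material. A sketch assuming an 8-hex-character prefix (the actual prefix length is not stated in this commit):

```typescript
import { createHash } from "node:crypto";

// Log-safe identifier for a secret key: the first 8 hex chars of its
// SHA-256 digest. Deterministic for the same key, practically unique
// across different keys, and non-reversible.
function keyFingerprint(key: string | Buffer): string {
  return createHash("sha256").update(key).digest("hex").slice(0, 8);
}

console.log(keyFingerprint("test")); // "9f86d081"
```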

**Backward Compatibility:**
- dataSize field is optional (metadata.dataSize?: number)
- Old encrypted files without dataSize continue to work
- No migration required
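
The optional-field compatibility check can be sketched like this (the `iv`/`authTag` field names are illustrative; only `dataSize?: number` is stated above):

```typescript
interface EncryptedFileMetadata {
  iv: string;
  authTag: string;
  dataSize?: number; // absent on legacy files written before this change
}

// Reject obviously truncated writes before attempting GCM decryption,
// while accepting legacy metadata that predates the dataSize field.
function validateDataSize(meta: EncryptedFileMetadata, actualSize: number): boolean {
  if (meta.dataSize === undefined) return true; // old file: skip the check
  return meta.dataSize === actualSize;
}

console.log(validateDataSize({ iv: "..", authTag: "..", dataSize: 16 }, 16)); // true
console.log(validateDataSize({ iv: "..", authTag: ".." }, 999)); // true (legacy)
console.log(validateDataSize({ iv: "..", authTag: "..", dataSize: 16 }, 8)); // false
```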

**Testing:**
- Compiled successfully
- No breaking changes to existing APIs
- Graceful handling of legacy v1 encrypted files

Fixes data loss issues reported by users experiencing container restarts
and system crashes during database saves.

* fix: Cleanup PR

* Update src/backend/utils/database-file-encryption.ts

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* Update src/backend/utils/database-file-encryption.ts

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* Update src/backend/utils/database-file-encryption.ts

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* Update src/backend/utils/database-file-encryption.ts

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* Update src/backend/utils/database-file-encryption.ts

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

---------

Co-authored-by: LukeGus <bugattiguy527@gmail.com>
Co-authored-by: Luke Gustafson <88517757+LukeGus@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* fix: Merge metadata and DB into 1 file

* fix: Add initial command palette

* Feature/german language support (#431)

* Update translation.json

Fixed some translation issues for German; made it more user-friendly and idiomatic.

* Update translation.json

Added updated block for serverStats

* Update translation.json

Added translations

* Update translation.json

Removed duplicate of "free":"Free"

* feat: Finalize command palette

* fix: Several bug fixes for terminals, server stats, and general feature improvements

* feat: Enhanced security, UI improvements, and animations (#432)

* fix: Remove empty catch blocks and add error logging

* refactor: Modularize server stats widget collectors

* feat: Add i18n support for terminal customization and login stats

- Add comprehensive terminal customization translations (60+ keys) for appearance, behavior, and advanced settings across all 4 languages
- Add SSH login statistics translations
- Update HostManagerEditor to use i18n for all terminal customization UI elements
- Update LoginStatsWidget to use i18n for all UI text
- Add missing logger imports in backend files for improved debugging

* feat: Add keyboard shortcut enhancements with Kbd component

- Add shadcn kbd component for displaying keyboard shortcuts
- Enhance file manager context menu to display shortcuts with Kbd component
- Add 5 new keyboard shortcuts to file manager:
  - Ctrl+D: Download selected files
  - Ctrl+N: Create new file
  - Ctrl+Shift+N: Create new folder
  - Ctrl+U: Upload files
  - Enter: Open/run selected file
- Add keyboard shortcut hints to command palette footer
- Create helper function to parse and render keyboard shortcuts

* feat: Add i18n support for command palette

- Add commandPalette translation section with 22 keys to all 4 languages
- Update CommandPalette component to use i18n for all UI text
- Translate search placeholder, group headings, menu items, and shortcut hints
- Support multilingual command palette interface

* feat: Add smooth transitions and animations to UI

- Add fade-in/fade-out transition to command palette (200ms)
- Add scale animation to command palette on open/close
- Add smooth popup animation to context menu (150ms)
- Add visual feedback for file selection with ring effect
- Add hover scale effect to file grid items
- Add transition-all to list view items for consistent behavior
- Zero JavaScript overhead, pure CSS transitions
- All animations under 200ms for instant feel

* feat: Add button active state and dashboard card animations

- Add active:scale-95 to all buttons for tactile click feedback
- Add hover border effect to dashboard cards (150ms transition)
- Add pulse animation to dashboard loading states
- Pure CSS transitions with zero JavaScript overhead
- Improves enterprise-level feel of UI

* feat: Add smooth macOS-style page transitions

- Add fullscreen crossfade transition for login/logout (300ms fade-out + 400ms fade-in)
- Add slide-in-from-right animation for all page switches (Dashboard, Terminal, SSH Manager, Admin, Profile)
- Fix TypeScript compilation by adding esModuleInterop to tsconfig.node.json
- Pass handleLogout from DesktopApp to LeftSidebar for consistent transition behavior

All page transitions now use Tailwind animate-in utilities with 300ms duration for smooth, native-feeling UX

* fix: Add key prop to force animation re-trigger on tab switch

Each page container now has key={currentTab} to ensure React unmounts and remounts the element on every tab switch, properly triggering the slide-in animation

* revert: Remove page transition animations

Page switching animations were not noticeable enough and felt unnecessary.
Keep only the login/logout fullscreen crossfade transitions, which provide clear visual feedback for authentication state changes.

* feat: Add ripple effect to login/logout transitions

Add three-layer expanding ripple animation during fadeOut phase:
- Ripples expand from screen center using primary theme color
- Each layer has staggered delay (0ms, 150ms, 300ms) for wave effect
- Ripples fade out as they expand to create elegant visual feedback
- Uses pure CSS keyframe animation, no external libraries

Total animation: 800ms ripple + 300ms screen fade

* feat: Add smooth TERMIX logo animation to transitions

Changes:
- Extend transition duration from 300ms/400ms to 800ms/600ms for more elegant feel
- Reduce ripple intensity from /20,/15,/10 to /8,/5 for subtlety
- Slow down ripple animation from 0.8s to 2s with cubic-bezier easing
- Add centered TERMIX logo with monospace font and subtitle
- Logo fades in from 80% scale, holds, then fades out at 110% scale
- Total effect: 1.2s logo animation synced with 2s ripple waves

Creates a premium, branded transition experience

* feat: Enhance transition animation with premium details

Timing adjustments:
- Extend fadeOut from 800ms to 1200ms
- Extend fadeIn from 600ms to 800ms
- Slow background fade to 700ms for elegance

Visual enhancements:
- Add 4-layer ripple waves (10%, 7%, 5%, 3% opacity) with staggered delays
- Ripple animation extended to 2.5s with refined opacity curve
- Logo blur effect: starts at 8px, sharpens to 0px, exits at 4px
- Logo glow effect: triple-layer text-shadow using primary theme color
- Increase logo size from text-6xl to text-7xl
- Subtitle delayed fade-in from bottom with smooth slide animation

Creates a cinematic, polished brand experience

* feat: Redesign login page with split-screen cinematic layout

Major redesign of authentication page:

Left Side (40% width):
- Full-height gradient background using primary theme color
- Large TERMIX logo with glow effect
- Subtitle and tagline
- Infinite animated ripple waves (3 layers)
- Hidden on mobile, shows brand identity

Right Side (60% width):
- Centered glassmorphism card with backdrop blur
- Refined tab switcher with pill-style active state
- Enlarged title with gradient text effect
- Added welcome subtitles for better UX
- Card slides in from bottom on load
- All existing functionality preserved

Visual enhancements:
- Tab navigation: segmented control style in muted container
- Active tab: white background with subtle shadow
- Smooth 200ms transitions on all interactions
- Card: rounded-2xl, shadow-xl, semi-transparent border

Creates premium, modern login experience matching transition animations

* feat: Update login page theme colors and add i18n support

- Changed login page gradient from blue to match dark theme colors
- Updated ripple effects to use theme primary color
- Added i18n translation keys for login page (auth.tagline, auth.description, auth.welcomeBack, auth.createAccount, auth.continueExternal)
- Updated all language files (en, zh, de, ru, pt-BR) with new translations
- Fixed TypeScript compilation issues by clearing build cache

* refactor: Use shadcn Tabs component and fix modal styling

- Replace custom tab navigation with shadcn Tabs component
- Restore border-2 border-dark-border for modal consistency
- Remove circular icon from login success message
- Simplify authentication success display

* refactor: Remove ripple effects and gradient from login page

- Remove animated ripple background effects
- Remove gradient background, use solid color (bg-dark-bg-darker)
- Remove text-shadow glow effect from logo
- Simplify brand showcase to clean, minimal design

* feat: Add decorative slash and remove subtitle from login page

- Add decorative slash divider with gradient lines below TERMIX logo
- Remove subtitle text (welcomeBack and createAccount)
- Simplify page title to show only the main heading

* feat: Add diagonal line pattern background to login page

- Replace decorative slash with subtle diagonal line pattern background
- Use repeating-linear-gradient at 45deg angle
- Set very low opacity (0.03) for subtle effect
- Pattern uses theme primary color

* fix: Display diagonal line pattern on login background

- Combine background color and pattern in single style attribute
- Use white semi-transparent lines (rgba 0.03 opacity)
- 45deg angle, 35px spacing, 2px width
- Remove separate overlay div to ensure pattern visibility

* security: Fix user enumeration vulnerability in login

- Unify error messages for invalid username and incorrect password
- Both return 401 status with 'Invalid username or password'
- Prevent attackers from enumerating valid usernames
- Maintain detailed logging for debugging purposes
- Changed from 404 'User not found' to generic auth failure message
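
The fix boils down to collapsing both failure branches into one indistinguishable response. A minimal sketch (hypothetical helper, not the actual route handler):

```typescript
// Both unknown-username and wrong-password cases return an identical
// 401 response, so an attacker cannot enumerate valid usernames.
// The distinction is kept only in server-side logs for debugging.
function loginFailureResponse(reason: "no_such_user" | "bad_password") {
  console.error(`login failed: ${reason}`); // detailed logging stays server-side
  return { status: 401, body: { error: "Invalid username or password" } };
}

const a = loginFailureResponse("no_such_user");
const b = loginFailureResponse("bad_password");
console.log(JSON.stringify(a) === JSON.stringify(b)); // true: indistinguishable
```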

* security: Add login rate limiting to prevent brute force attacks

- Implement LoginRateLimiter with IP and username-based tracking
- Block after 5 failed attempts within 15 minutes
- Lock account/IP for 15 minutes after threshold
- Automatic cleanup of expired entries every 5 minutes
- Track remaining attempts in logs for monitoring
- Return 429 status with remaining time on rate limit
- Reset counters on successful login
- Dual protection: both IP-based and username-based limits
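
The behavior described above can be sketched roughly as follows (illustrative only; the method names and internal structure are assumptions, not the project's actual LoginRateLimiter source):

```typescript
type Entry = { count: number; firstAttempt: number; blockedUntil: number };

const WINDOW_MS = 15 * 60 * 1000; // 15-minute failure window
const MAX_ATTEMPTS = 5;           // block after 5 failed attempts
const BLOCK_MS = 15 * 60 * 1000;  // 15-minute lockout

class LoginRateLimiter {
  private entries = new Map<string, Entry>();

  // Returns remaining block time in ms, or 0 if this key may attempt.
  check(key: string, now = Date.now()): number {
    const e = this.entries.get(key);
    if (!e) return 0;
    if (now < e.blockedUntil) return e.blockedUntil - now;
    if (now - e.firstAttempt > WINDOW_MS) this.entries.delete(key); // expired entry
    return 0;
  }

  recordFailure(key: string, now = Date.now()): void {
    const e = this.entries.get(key);
    if (!e || now - e.firstAttempt > WINDOW_MS) {
      this.entries.set(key, { count: 1, firstAttempt: now, blockedUntil: 0 });
      return;
    }
    e.count += 1;
    if (e.count >= MAX_ATTEMPTS) e.blockedUntil = now + BLOCK_MS;
  }

  // Successful login resets the counter (called for both IP and username keys).
  reset(key: string): void {
    this.entries.delete(key);
  }
}

// Dual protection: track per-IP and per-username keys independently.
const limiter = new LoginRateLimiter();
for (let i = 0; i < 5; i++) limiter.recordFailure("ip:203.0.113.7");
console.log(limiter.check("ip:203.0.113.7") > 0); // true: blocked, respond 429
console.log(limiter.check("user:alice"));          // 0: not blocked
```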

* French translation (#434)

* Adding French Language

* Enhancements

* feat: Replace the old ssh tools system with a new dedicated sidebar

* fix: Merge zac/luke

* fix: Finalize new sidebar, improve loading animations

* Added ability to close non-primary tabs involved in a split view (#435)

* fix: General bug fixes/small feature improvements

* feat: General UI improvements and translation updates

* fix: Command history and file manager styling issues

* feat: General bug fixes; added server stat commands, improved split screen, account linking, etc

* fix: add Accept header for OIDC callback request (#436)

* Delete DOWNLOADS.md

* fix: add Accept header for OIDC callback request

---------

Co-authored-by: Luke Gustafson <88517757+LukeGus@users.noreply.github.com>

* fix: More bug fixes and QOL fixes

* fix: Server stats not respecting interval and SSH tool type issues

* fix: Remove github links

* fix: Delete account spacing

* fix: Increment version

* fix: Unable to delete hosts and add nginx for terminal

* fix: Unable to delete hosts

* fix: Unable to delete hosts

* fix: Unable to delete hosts

* fix: OIDC/local account linking breaking both logins

* chore: File cleanup

* feat: Max terminal tab size and save current file manager sorting type

* fix: Terminal display issue, migrate host editor to use combobox

* feat: Add snippet folder/customization system

* fix: Fix OIDC linking and prep release

* fix: Increment version

---------

Co-authored-by: ZacharyZcR <zacharyzcr1984@gmail.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Max <herzmaximilian@gmail.com>
Co-authored-by: SlimGary <trash.slim@gmail.com>
Co-authored-by: jarrah31 <jarrah31@gmail.com>
Co-authored-by: Kf637 <mail@kf637.tech>
This commit was merged in pull request #437.
Author: Luke Gustafson
Date: 2025-11-17 09:46:05 -06:00 (committed by GitHub)
Parent: 38a59f3579
Commit: 8366c99b0f
104 changed files with 16070 additions and 2821 deletions


@@ -1,8 +1,8 @@
import type { AuthenticatedRequest } from "../../../types/index.js";
import express from "express";
import { db } from "../db/index.js";
-import { snippets } from "../db/schema.js";
-import { eq, and, desc, sql } from "drizzle-orm";
+import { snippets, snippetFolders } from "../db/schema.js";
+import { eq, and, desc, asc, sql } from "drizzle-orm";
import type { Request, Response } from "express";
import { authLogger } from "../../utils/logger.js";
import { AuthManager } from "../../utils/auth-manager.js";
@@ -17,6 +17,651 @@ const authManager = AuthManager.getInstance();
const authenticateJWT = authManager.createAuthMiddleware();
const requireDataAccess = authManager.createDataAccessMiddleware();
// Get all snippet folders
// GET /snippets/folders
router.get(
"/folders",
authenticateJWT,
requireDataAccess,
async (req: Request, res: Response) => {
const userId = (req as AuthenticatedRequest).userId;
if (!isNonEmptyString(userId)) {
authLogger.warn("Invalid userId for snippet folders fetch");
return res.status(400).json({ error: "Invalid userId" });
}
try {
const result = await db
.select()
.from(snippetFolders)
.where(eq(snippetFolders.userId, userId))
.orderBy(asc(snippetFolders.name));
res.json(result);
} catch (err) {
authLogger.error("Failed to fetch snippet folders", err);
res.status(500).json({ error: "Failed to fetch snippet folders" });
}
},
);
// Create a new snippet folder
// POST /snippets/folders
router.post(
"/folders",
authenticateJWT,
requireDataAccess,
async (req: Request, res: Response) => {
const userId = (req as AuthenticatedRequest).userId;
const { name, color, icon } = req.body;
if (!isNonEmptyString(userId) || !isNonEmptyString(name)) {
authLogger.warn("Invalid snippet folder creation data", {
operation: "snippet_folder_create",
userId,
hasName: !!name,
});
return res.status(400).json({ error: "Folder name is required" });
}
try {
const existing = await db
.select()
.from(snippetFolders)
.where(
and(eq(snippetFolders.userId, userId), eq(snippetFolders.name, name)),
);
if (existing.length > 0) {
return res
.status(409)
.json({ error: "Folder with this name already exists" });
}
const insertData = {
userId,
name: name.trim(),
color: color?.trim() || null,
icon: icon?.trim() || null,
};
const result = await db
.insert(snippetFolders)
.values(insertData)
.returning();
authLogger.success(`Snippet folder created: ${name} by user ${userId}`, {
operation: "snippet_folder_create_success",
userId,
name,
});
res.status(201).json(result[0]);
} catch (err) {
authLogger.error("Failed to create snippet folder", err);
res.status(500).json({
error:
err instanceof Error
? err.message
: "Failed to create snippet folder",
});
}
},
);
// Update snippet folder metadata (color, icon)
// PUT /snippets/folders/:name/metadata
router.put(
"/folders/:name/metadata",
authenticateJWT,
requireDataAccess,
async (req: Request, res: Response) => {
const userId = (req as AuthenticatedRequest).userId;
const { name } = req.params;
const { color, icon } = req.body;
if (!isNonEmptyString(userId) || !name) {
authLogger.warn("Invalid request for snippet folder metadata update");
return res.status(400).json({ error: "Invalid request" });
}
try {
const existing = await db
.select()
.from(snippetFolders)
.where(
and(
eq(snippetFolders.userId, userId),
eq(snippetFolders.name, decodeURIComponent(name)),
),
);
if (existing.length === 0) {
return res.status(404).json({ error: "Folder not found" });
}
const updateFields: Partial<{
color: string | null;
icon: string | null;
updatedAt: ReturnType<typeof sql.raw>;
}> = {
updatedAt: sql`CURRENT_TIMESTAMP`,
};
if (color !== undefined) updateFields.color = color?.trim() || null;
if (icon !== undefined) updateFields.icon = icon?.trim() || null;
await db
.update(snippetFolders)
.set(updateFields)
.where(
and(
eq(snippetFolders.userId, userId),
eq(snippetFolders.name, decodeURIComponent(name)),
),
);
const updated = await db
.select()
.from(snippetFolders)
.where(
and(
eq(snippetFolders.userId, userId),
eq(snippetFolders.name, decodeURIComponent(name)),
),
);
authLogger.success(
`Snippet folder metadata updated: ${name} by user ${userId}`,
{
operation: "snippet_folder_metadata_update_success",
userId,
name,
},
);
res.json(updated[0]);
} catch (err) {
authLogger.error("Failed to update snippet folder metadata", err);
res.status(500).json({
error:
err instanceof Error
? err.message
: "Failed to update snippet folder metadata",
});
}
},
);
// Rename snippet folder
// PUT /snippets/folders/rename
router.put(
"/folders/rename",
authenticateJWT,
requireDataAccess,
async (req: Request, res: Response) => {
const userId = (req as AuthenticatedRequest).userId;
const { oldName, newName } = req.body;
if (
!isNonEmptyString(userId) ||
!isNonEmptyString(oldName) ||
!isNonEmptyString(newName)
) {
authLogger.warn("Invalid request for snippet folder rename");
return res.status(400).json({ error: "Invalid request" });
}
try {
const existing = await db
.select()
.from(snippetFolders)
.where(
and(
eq(snippetFolders.userId, userId),
eq(snippetFolders.name, oldName),
),
);
if (existing.length === 0) {
return res.status(404).json({ error: "Folder not found" });
}
const nameExists = await db
.select()
.from(snippetFolders)
.where(
and(
eq(snippetFolders.userId, userId),
eq(snippetFolders.name, newName),
),
);
if (nameExists.length > 0) {
return res
.status(409)
.json({ error: "Folder with new name already exists" });
}
await db
.update(snippetFolders)
.set({ name: newName, updatedAt: sql`CURRENT_TIMESTAMP` })
.where(
and(
eq(snippetFolders.userId, userId),
eq(snippetFolders.name, oldName),
),
);
await db
.update(snippets)
.set({ folder: newName })
.where(and(eq(snippets.userId, userId), eq(snippets.folder, oldName)));
authLogger.success(
`Snippet folder renamed: ${oldName} -> ${newName} by user ${userId}`,
{
operation: "snippet_folder_rename_success",
userId,
oldName,
newName,
},
);
res.json({ success: true, oldName, newName });
} catch (err) {
authLogger.error("Failed to rename snippet folder", err);
res.status(500).json({
error:
err instanceof Error
? err.message
: "Failed to rename snippet folder",
});
}
},
);
// Delete snippet folder
// DELETE /snippets/folders/:name
router.delete(
"/folders/:name",
authenticateJWT,
requireDataAccess,
async (req: Request, res: Response) => {
const userId = (req as AuthenticatedRequest).userId;
const { name } = req.params;
if (!isNonEmptyString(userId) || !name) {
authLogger.warn("Invalid request for snippet folder delete");
return res.status(400).json({ error: "Invalid request" });
}
try {
const folderName = decodeURIComponent(name);
await db
.update(snippets)
.set({ folder: null })
.where(
and(eq(snippets.userId, userId), eq(snippets.folder, folderName)),
);
await db
.delete(snippetFolders)
.where(
and(
eq(snippetFolders.userId, userId),
eq(snippetFolders.name, folderName),
),
);
authLogger.success(
`Snippet folder deleted: ${folderName} by user ${userId}`,
{
operation: "snippet_folder_delete_success",
userId,
name: folderName,
},
);
res.json({ success: true });
} catch (err) {
authLogger.error("Failed to delete snippet folder", err);
res.status(500).json({
error:
err instanceof Error
? err.message
: "Failed to delete snippet folder",
});
}
},
);
// Reorder snippets (bulk update)
// PUT /snippets/reorder
router.put(
"/reorder",
authenticateJWT,
requireDataAccess,
async (req: Request, res: Response) => {
const userId = (req as AuthenticatedRequest).userId;
const { snippets: snippetUpdates } = req.body;
if (!isNonEmptyString(userId)) {
authLogger.warn("Invalid userId for snippet reorder");
return res.status(400).json({ error: "Invalid userId" });
}
if (!Array.isArray(snippetUpdates) || snippetUpdates.length === 0) {
authLogger.warn("Invalid snippet reorder data", {
operation: "snippet_reorder",
userId,
});
return res
.status(400)
.json({ error: "snippets array is required and must not be empty" });
}
try {
for (const update of snippetUpdates) {
const { id, order, folder } = update;
if (!id || order === undefined) {
continue;
}
const updateFields: Partial<{
order: number;
folder: string | null;
}> = {
order,
};
if (folder !== undefined) {
updateFields.folder = folder?.trim() || null;
}
await db
.update(snippets)
.set(updateFields)
.where(and(eq(snippets.id, id), eq(snippets.userId, userId)));
}
authLogger.success(`Snippets reordered by user ${userId}`, {
operation: "snippet_reorder_success",
userId,
count: snippetUpdates.length,
});
res.json({ success: true, updated: snippetUpdates.length });
} catch (err) {
authLogger.error("Failed to reorder snippets", err);
res.status(500).json({
error:
err instanceof Error ? err.message : "Failed to reorder snippets",
});
}
},
);
// Execute a snippet on a host
// POST /snippets/execute
router.post(
"/execute",
authenticateJWT,
requireDataAccess,
async (req: Request, res: Response) => {
const userId = (req as AuthenticatedRequest).userId;
const { snippetId, hostId } = req.body;
if (!isNonEmptyString(userId) || !snippetId || !hostId) {
authLogger.warn("Invalid snippet execution request", {
userId,
snippetId,
hostId,
});
return res
.status(400)
.json({ error: "Snippet ID and Host ID are required" });
}
try {
const snippetResult = await db
.select()
.from(snippets)
.where(
and(
eq(snippets.id, parseInt(snippetId)),
eq(snippets.userId, userId),
),
);
if (snippetResult.length === 0) {
return res.status(404).json({ error: "Snippet not found" });
}
const snippet = snippetResult[0];
const { Client } = await import("ssh2");
const { sshData, sshCredentials } = await import("../db/schema.js");
const { SimpleDBOps } = await import("../../utils/simple-db-ops.js");
const hostResult = await SimpleDBOps.select(
db
.select()
.from(sshData)
.where(
and(eq(sshData.id, parseInt(hostId)), eq(sshData.userId, userId)),
),
"ssh_data",
userId,
);
if (hostResult.length === 0) {
return res.status(404).json({ error: "Host not found" });
}
const host = hostResult[0];
let password = host.password;
let privateKey = host.key;
let passphrase = host.key_password;
let authType = host.authType;
if (host.credentialId) {
const credResult = await SimpleDBOps.select(
db
.select()
.from(sshCredentials)
.where(
and(
eq(sshCredentials.id, host.credentialId as number),
eq(sshCredentials.userId, userId),
),
),
"ssh_credentials",
userId,
);
if (credResult.length > 0) {
const cred = credResult[0];
authType = (cred.auth_type || cred.authType || authType) as string;
password = (cred.password || undefined) as string | undefined;
privateKey = (cred.private_key || cred.key || undefined) as
| string
| undefined;
passphrase = (cred.key_password || undefined) as string | undefined;
}
}
const conn = new Client();
let output = "";
let errorOutput = "";
const executePromise = new Promise<{
success: boolean;
output: string;
error?: string;
}>((resolve, reject) => {
const timeout = setTimeout(() => {
conn.end();
reject(new Error("Command execution timeout (30s)"));
}, 30000);
conn.on("ready", () => {
conn.exec(snippet.content, (err, stream) => {
if (err) {
clearTimeout(timeout);
conn.end();
return reject(err);
}
stream.on("close", () => {
clearTimeout(timeout);
conn.end();
if (errorOutput) {
resolve({ success: false, output, error: errorOutput });
} else {
resolve({ success: true, output });
}
});
stream.on("data", (data: Buffer) => {
output += data.toString();
});
stream.stderr.on("data", (data: Buffer) => {
errorOutput += data.toString();
});
});
});
conn.on("error", (err) => {
clearTimeout(timeout);
reject(err);
});
const config: any = {
host: host.ip,
port: host.port,
username: host.username,
tryKeyboard: true,
keepaliveInterval: 30000,
keepaliveCountMax: 3,
readyTimeout: 30000,
tcpKeepAlive: true,
tcpKeepAliveInitialDelay: 30000,
timeout: 30000,
env: {
TERM: "xterm-256color",
LANG: "en_US.UTF-8",
LC_ALL: "en_US.UTF-8",
LC_CTYPE: "en_US.UTF-8",
LC_MESSAGES: "en_US.UTF-8",
LC_MONETARY: "en_US.UTF-8",
LC_NUMERIC: "en_US.UTF-8",
LC_TIME: "en_US.UTF-8",
LC_COLLATE: "en_US.UTF-8",
COLORTERM: "truecolor",
},
algorithms: {
kex: [
"curve25519-sha256",
"curve25519-sha256@libssh.org",
"ecdh-sha2-nistp521",
"ecdh-sha2-nistp384",
"ecdh-sha2-nistp256",
"diffie-hellman-group-exchange-sha256",
"diffie-hellman-group14-sha256",
"diffie-hellman-group14-sha1",
"diffie-hellman-group-exchange-sha1",
"diffie-hellman-group1-sha1",
],
serverHostKey: [
"ssh-ed25519",
"ecdsa-sha2-nistp521",
"ecdsa-sha2-nistp384",
"ecdsa-sha2-nistp256",
"rsa-sha2-512",
"rsa-sha2-256",
"ssh-rsa",
"ssh-dss",
],
cipher: [
"chacha20-poly1305@openssh.com",
"aes256-gcm@openssh.com",
"aes128-gcm@openssh.com",
"aes256-ctr",
"aes192-ctr",
"aes128-ctr",
"aes256-cbc",
"aes192-cbc",
"aes128-cbc",
"3des-cbc",
],
hmac: [
"hmac-sha2-512-etm@openssh.com",
"hmac-sha2-256-etm@openssh.com",
"hmac-sha2-512",
"hmac-sha2-256",
"hmac-sha1",
"hmac-md5",
],
compress: ["none", "zlib@openssh.com", "zlib"],
},
};
if (authType === "password" && password) {
config.password = password;
} else if (authType === "key" && privateKey) {
const cleanKey = (privateKey as string)
.trim()
.replace(/\r\n/g, "\n")
.replace(/\r/g, "\n");
config.privateKey = Buffer.from(cleanKey, "utf8");
if (passphrase) {
config.passphrase = passphrase;
}
} else if (password) {
config.password = password;
} else if (privateKey) {
const cleanKey = (privateKey as string)
.trim()
.replace(/\r\n/g, "\n")
.replace(/\r/g, "\n");
config.privateKey = Buffer.from(cleanKey, "utf8");
if (passphrase) {
config.passphrase = passphrase;
}
}
conn.connect(config);
});
const result = await executePromise;
authLogger.success(
`Snippet executed: ${snippet.name} on host ${hostId}`,
{
operation: "snippet_execute_success",
userId,
snippetId,
hostId,
},
);
res.json(result);
} catch (err) {
authLogger.error("Failed to execute snippet", err);
res.status(500).json({
error: err instanceof Error ? err.message : "Failed to execute snippet",
});
}
},
);
// Get all snippets for the authenticated user
// GET /snippets
router.get(
@@ -36,7 +681,12 @@ router.get(
.select()
.from(snippets)
.where(eq(snippets.userId, userId))
-.orderBy(desc(snippets.updatedAt));
+.orderBy(
+sql`CASE WHEN ${snippets.folder} IS NULL OR ${snippets.folder} = '' THEN 0 ELSE 1 END`,
+asc(snippets.folder),
+asc(snippets.order),
+desc(snippets.updatedAt),
+);
res.json(result);
} catch (err) {
@@ -93,7 +743,7 @@ router.post(
requireDataAccess,
async (req: Request, res: Response) => {
const userId = (req as AuthenticatedRequest).userId;
-const { name, content, description } = req.body;
+const { name, content, description, folder, order } = req.body;
if (
!isNonEmptyString(userId) ||
@@ -110,11 +760,31 @@ router.post(
}
try {
+let snippetOrder = order;
+if (snippetOrder === undefined || snippetOrder === null) {
+const folderValue = folder?.trim() || "";
+const maxOrderResult = await db
+.select({ maxOrder: sql<number>`MAX(${snippets.order})` })
+.from(snippets)
+.where(
+and(
+eq(snippets.userId, userId),
+folderValue
+? eq(snippets.folder, folderValue)
+: sql`(${snippets.folder} IS NULL OR ${snippets.folder} = '')`,
+),
+);
+const maxOrder = maxOrderResult[0]?.maxOrder ?? -1;
+snippetOrder = maxOrder + 1;
+}
const insertData = {
userId,
name: name.trim(),
content: content.trim(),
description: description?.trim() || null,
+folder: folder?.trim() || null,
+order: snippetOrder,
};
const result = await db.insert(snippets).values(insertData).returning();
@@ -167,6 +837,8 @@ router.put(
name: string;
content: string;
description: string | null;
+folder: string | null;
+order: number;
}> = {
updatedAt: sql`CURRENT_TIMESTAMP`,
};
@@ -177,6 +849,9 @@ router.put(
updateFields.content = updateData.content.trim();
if (updateData.description !== undefined)
updateFields.description = updateData.description?.trim() || null;
+if (updateData.folder !== undefined)
+updateFields.folder = updateData.folder?.trim() || null;
+if (updateData.order !== undefined) updateFields.order = updateData.order;
await db
.update(snippets)