* fix: Resolve database encryption atomicity issues and enhance debugging (#430)

* fix: Resolve database encryption atomicity issues and enhance debugging

This commit addresses critical data corruption issues caused by non-atomic
file writes during database encryption, and adds comprehensive diagnostic
logging to help debug encryption-related failures.

**Problem:**
Users reported "Unsupported state or unable to authenticate data" errors
when starting the application after system crashes or Docker container
restarts. The root cause was non-atomic writes of encrypted database files:

1. Encrypted data file written
2. Metadata file written
→ If process crashes between steps 1 and 2, files become inconsistent
→ New IV/tag in data file, old IV/tag in metadata
→ GCM authentication fails on next startup
→ User data permanently inaccessible
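
This failure mode is easy to reproduce with Node's crypto module: decrypting a ciphertext against a stale IV/tag pair fails GCM authentication with exactly the error users reported. A minimal sketch (not the project's code):

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

const key = randomBytes(32);

// Encrypt twice, simulating the rewritten data file and the stale metadata.
function encrypt(plaintext: string) {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const data = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return { iv, data, tag: cipher.getAuthTag() };
}

const oldWrite = encrypt("db v1"); // metadata file still holds this IV/tag
const newWrite = encrypt("db v2"); // data file was rewritten before the crash

// Decrypting the new ciphertext with the stale IV/tag fails authentication.
const decipher = createDecipheriv("aes-256-gcm", key, oldWrite.iv);
decipher.setAuthTag(oldWrite.tag);
try {
  decipher.update(newWrite.data);
  decipher.final();
} catch (err) {
  console.log((err as Error).message); // "Unsupported state or unable to authenticate data"
}
```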

**Solution - Atomic Writes:**

1. Write-to-temp + atomic-rename pattern:
   - Write to temporary files (*.tmp-timestamp-pid)
   - Perform atomic rename operations
   - Clean up temp files on failure

2. Data integrity validation:
   - Add dataSize field to metadata
   - Verify file size before decryption
   - Early detection of corrupted writes

3. Enhanced error diagnostics:
   - Key fingerprints (SHA256 prefix) for verification
   - File modification timestamps
   - Detailed GCM auth failure messages
   - Automatic diagnostic info generation
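
The write-to-temp + atomic-rename pattern above can be sketched as follows. This is a minimal illustration; `writeFileAtomic` is a hypothetical name, not the function used in `database-file-encryption.ts`:

```typescript
import { readFileSync, renameSync, unlinkSync, writeFileSync } from "node:fs";

// Sketch of the write-to-temp + atomic-rename pattern described above.
function writeFileAtomic(path: string, contents: Buffer): void {
  // Unique temp name in the same directory, so the rename stays on one filesystem.
  const tmpPath = `${path}.tmp-${Date.now()}-${process.pid}`;
  try {
    writeFileSync(tmpPath, contents);
    renameSync(tmpPath, path); // rename(2) is atomic on POSIX filesystems
  } catch (err) {
    try {
      unlinkSync(tmpPath); // clean up the orphaned temp file on failure
    } catch {}
    throw err;
  }
}

const demo = `/tmp/atomic-demo-${process.pid}`;
writeFileAtomic(demo, Buffer.from("ciphertext"));
console.log(readFileSync(demo).length); // 10
unlinkSync(demo);
```

A reader observing the file path now sees either the complete old file or the complete new file, never a partial write; the guarantee holds only when the temp file and target share a filesystem.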

**Changes:**

database-file-encryption.ts:
- Implement atomic write pattern in encryptDatabaseFromBuffer
- Implement atomic write pattern in encryptDatabaseFile
- Add dataSize field to EncryptedFileMetadata interface
- Validate file size before decryption in decryptDatabaseToBuffer
- Enhanced error messages for GCM auth failures
- Add getDiagnosticInfo() function for comprehensive debugging
- Add debug logging for all encryption/decryption operations

system-crypto.ts:
- Add detailed logging for DATABASE_KEY initialization
- Log key source (env var vs .env file)
- Add key fingerprints to all log messages
- Better error messages when key loading fails

db/index.ts:
- Automatically generate diagnostic info on decryption failure
- Log detailed debugging information to help users troubleshoot

**Debugging Info Added:**

- Key initialization: source, fingerprint, length, path
- Encryption: original size, encrypted size, IV/tag prefixes, temp paths
- Decryption: file timestamps, metadata content, key fingerprint matching
- Auth failures: .env file status, key availability, file consistency
- File diagnostics: existence, readability, size validation, mtime comparison
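
A key fingerprint of the kind listed above can be as simple as a SHA-256 prefix: enough to confirm in logs that encryption and decryption saw the same key, without leaking the key itself. Illustrative helper, not the project's real function:

```typescript
import { createHash, randomBytes } from "node:crypto";

// Short, non-reversible identifier for a key; safe to put in log messages.
function keyFingerprint(key: Buffer): string {
  return createHash("sha256").update(key).digest("hex").slice(0, 8);
}

const key = randomBytes(32);
console.log(`DATABASE_KEY fingerprint: ${keyFingerprint(key)}`);
```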

**Backward Compatibility:**
- dataSize field is optional (metadata.dataSize?: number)
- Old encrypted files without dataSize continue to work
- No migration required
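
A sketch of how the optional field keeps legacy metadata working; the shape is illustrative and only the `dataSize` name follows the commit description:

```typescript
// Illustrative metadata shape, not the actual interface.
interface EncryptedFileMetadata {
  iv: string;
  authTag: string;
  dataSize?: number; // absent in files written before this change
}

function checkSize(meta: EncryptedFileMetadata, actualSize: number): void {
  // Legacy files without dataSize simply skip the check, so no migration is needed.
  if (meta.dataSize !== undefined && meta.dataSize !== actualSize) {
    throw new Error(`size mismatch: expected ${meta.dataSize}, got ${actualSize}`);
  }
}
```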

**Testing:**
- Compiled successfully
- No breaking changes to existing APIs
- Graceful handling of legacy v1 encrypted files

Fixes data loss issues reported by users experiencing container restarts
and system crashes during database saves.

* fix: Cleanup PR

* Update src/backend/utils/database-file-encryption.ts

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* Update src/backend/utils/database-file-encryption.ts

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* Update src/backend/utils/database-file-encryption.ts

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* Update src/backend/utils/database-file-encryption.ts

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* Update src/backend/utils/database-file-encryption.ts

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

---------

Co-authored-by: LukeGus <bugattiguy527@gmail.com>
Co-authored-by: Luke Gustafson <88517757+LukeGus@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* fix: Merge metadata and DB into 1 file

* fix: Add initial command palette

* Feature/german language support (#431)

* Update translation.json

Fixed some translation issues for German, making it more user-friendly and natural.

* Update translation.json

Added updated block for serverStats

* Update translation.json

Added translations

* Update translation.json

Removed duplicate of "free":"Free"

* feat: Finalize command palette

* fix: Several bug fixes for terminals, server stats, and general feature improvements

* feat: Enhanced security, UI improvements, and animations (#432)

* fix: Remove empty catch blocks and add error logging

* refactor: Modularize server stats widget collectors

* feat: Add i18n support for terminal customization and login stats

- Add comprehensive terminal customization translations (60+ keys) for appearance, behavior, and advanced settings across all 4 languages
- Add SSH login statistics translations
- Update HostManagerEditor to use i18n for all terminal customization UI elements
- Update LoginStatsWidget to use i18n for all UI text
- Add missing logger imports in backend files for improved debugging

* feat: Add keyboard shortcut enhancements with Kbd component

- Add shadcn kbd component for displaying keyboard shortcuts
- Enhance file manager context menu to display shortcuts with Kbd component
- Add 5 new keyboard shortcuts to file manager:
  - Ctrl+D: Download selected files
  - Ctrl+N: Create new file
  - Ctrl+Shift+N: Create new folder
  - Ctrl+U: Upload files
  - Enter: Open/run selected file
- Add keyboard shortcut hints to command palette footer
- Create helper function to parse and render keyboard shortcuts

* feat: Add i18n support for command palette

- Add commandPalette translation section with 22 keys to all 4 languages
- Update CommandPalette component to use i18n for all UI text
- Translate search placeholder, group headings, menu items, and shortcut hints
- Support multilingual command palette interface

* feat: Add smooth transitions and animations to UI

- Add fade-in/fade-out transition to command palette (200ms)
- Add scale animation to command palette on open/close
- Add smooth popup animation to context menu (150ms)
- Add visual feedback for file selection with ring effect
- Add hover scale effect to file grid items
- Add transition-all to list view items for consistent behavior
- Zero JavaScript overhead, pure CSS transitions
- All animations under 200ms for instant feel

* feat: Add button active state and dashboard card animations

- Add active:scale-95 to all buttons for tactile click feedback
- Add hover border effect to dashboard cards (150ms transition)
- Add pulse animation to dashboard loading states
- Pure CSS transitions with zero JavaScript overhead
- Improves enterprise-level feel of UI

* feat: Add smooth macOS-style page transitions

- Add fullscreen crossfade transition for login/logout (300ms fade-out + 400ms fade-in)
- Add slide-in-from-right animation for all page switches (Dashboard, Terminal, SSH Manager, Admin, Profile)
- Fix TypeScript compilation by adding esModuleInterop to tsconfig.node.json
- Pass handleLogout from DesktopApp to LeftSidebar for consistent transition behavior

All page transitions now use Tailwind animate-in utilities with 300ms duration for smooth, native-feeling UX

* fix: Add key prop to force animation re-trigger on tab switch

Each page container now has key={currentTab} to ensure React unmounts and remounts the element on every tab switch, properly triggering the slide-in animation

* revert: Remove page transition animations

Page switching animations were not noticeable enough and felt unnecessary.
Keep only the login/logout fullscreen crossfade transitions which provide clear visual feedback for authentication state changes

* feat: Add ripple effect to login/logout transitions

Add three-layer expanding ripple animation during fadeOut phase:
- Ripples expand from screen center using primary theme color
- Each layer has staggered delay (0ms, 150ms, 300ms) for wave effect
- Ripples fade out as they expand to create elegant visual feedback
- Uses pure CSS keyframe animation, no external libraries

Total animation: 800ms ripple + 300ms screen fade

* feat: Add smooth TERMIX logo animation to transitions

Changes:
- Extend transition duration from 300ms/400ms to 800ms/600ms for more elegant feel
- Reduce ripple intensity from /20,/15,/10 to /8,/5 for subtlety
- Slow down ripple animation from 0.8s to 2s with cubic-bezier easing
- Add centered TERMIX logo with monospace font and subtitle
- Logo fades in from 80% scale, holds, then fades out at 110% scale
- Total effect: 1.2s logo animation synced with 2s ripple waves

Creates a premium, branded transition experience

* feat: Enhance transition animation with premium details

Timing adjustments:
- Extend fadeOut from 800ms to 1200ms
- Extend fadeIn from 600ms to 800ms
- Slow background fade to 700ms for elegance

Visual enhancements:
- Add 4-layer ripple waves (10%, 7%, 5%, 3% opacity) with staggered delays
- Ripple animation extended to 2.5s with refined opacity curve
- Logo blur effect: starts at 8px, sharpens to 0px, exits at 4px
- Logo glow effect: triple-layer text-shadow using primary theme color
- Increase logo size from text-6xl to text-7xl
- Subtitle delayed fade-in from bottom with smooth slide animation

Creates a cinematic, polished brand experience

* feat: Redesign login page with split-screen cinematic layout

Major redesign of authentication page:

Left Side (40% width):
- Full-height gradient background using primary theme color
- Large TERMIX logo with glow effect
- Subtitle and tagline
- Infinite animated ripple waves (3 layers)
- Hidden on mobile, shows brand identity

Right Side (60% width):
- Centered glassmorphism card with backdrop blur
- Refined tab switcher with pill-style active state
- Enlarged title with gradient text effect
- Added welcome subtitles for better UX
- Card slides in from bottom on load
- All existing functionality preserved

Visual enhancements:
- Tab navigation: segmented control style in muted container
- Active tab: white background with subtle shadow
- Smooth 200ms transitions on all interactions
- Card: rounded-2xl, shadow-xl, semi-transparent border

Creates premium, modern login experience matching transition animations

* feat: Update login page theme colors and add i18n support

- Changed login page gradient from blue to match dark theme colors
- Updated ripple effects to use theme primary color
- Added i18n translation keys for login page (auth.tagline, auth.description, auth.welcomeBack, auth.createAccount, auth.continueExternal)
- Updated all language files (en, zh, de, ru, pt-BR) with new translations
- Fixed TypeScript compilation issues by clearing build cache

* refactor: Use shadcn Tabs component and fix modal styling

- Replace custom tab navigation with shadcn Tabs component
- Restore border-2 border-dark-border for modal consistency
- Remove circular icon from login success message
- Simplify authentication success display

* refactor: Remove ripple effects and gradient from login page

- Remove animated ripple background effects
- Remove gradient background, use solid color (bg-dark-bg-darker)
- Remove text-shadow glow effect from logo
- Simplify brand showcase to clean, minimal design

* feat: Add decorative slash and remove subtitle from login page

- Add decorative slash divider with gradient lines below TERMIX logo
- Remove subtitle text (welcomeBack and createAccount)
- Simplify page title to show only the main heading

* feat: Add diagonal line pattern background to login page

- Replace decorative slash with subtle diagonal line pattern background
- Use repeating-linear-gradient at 45deg angle
- Set very low opacity (0.03) for subtle effect
- Pattern uses theme primary color

* fix: Display diagonal line pattern on login background

- Combine background color and pattern in single style attribute
- Use white semi-transparent lines (rgba 0.03 opacity)
- 45deg angle, 35px spacing, 2px width
- Remove separate overlay div to ensure pattern visibility

* security: Fix user enumeration vulnerability in login

- Unify error messages for invalid username and incorrect password
- Both return 401 status with 'Invalid username or password'
- Prevent attackers from enumerating valid usernames
- Maintain detailed logging for debugging purposes
- Changed from 404 'User not found' to generic auth failure message
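
The unified failure path can be sketched as follows. The user store and hash scheme here are stand-ins, shown only to illustrate the identical 401 responses:

```typescript
// Stand-in user store and hash comparison (a real system uses bcrypt/argon2).
const users = new Map<string, string>([["alice", "hashed:secret"]]);

function verify(password: string, storedHash: string): boolean {
  return storedHash === `hashed:${password}`;
}

function login(username: string, password: string) {
  const storedHash = users.get(username);
  // Evaluate the same way for unknown users and wrong passwords so the
  // response never reveals which one failed.
  const ok = storedHash !== undefined && verify(password, storedHash);
  if (!ok) {
    return { status: 401, body: { error: "Invalid username or password" } };
  }
  return { status: 200, body: { message: "Login successful" } };
}
```

A hardened variant also runs a dummy hash comparison when the username is unknown, so response timing does not leak account existence either.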

* security: Add login rate limiting to prevent brute force attacks

- Implement LoginRateLimiter with IP and username-based tracking
- Block after 5 failed attempts within 15 minutes
- Lock account/IP for 15 minutes after threshold
- Automatic cleanup of expired entries every 5 minutes
- Track remaining attempts in logs for monitoring
- Return 429 status with remaining time on rate limit
- Reset counters on successful login
- Dual protection: both IP-based and username-based limits
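
A minimal sketch of such a limiter, using the thresholds from this commit; the names are illustrative, not the project's `LoginRateLimiter` implementation:

```typescript
const MAX_ATTEMPTS = 5;
const WINDOW_MS = 15 * 60 * 1000; // 15 minutes

interface Entry {
  count: number;
  firstAttempt: number;
}

class LoginRateLimiter {
  private attempts = new Map<string, Entry>();

  // `now` is injectable for testing; callers check both an "ip:" and a "user:" key.
  isBlocked(key: string, now = Date.now()): boolean {
    const e = this.attempts.get(key);
    if (!e) return false;
    if (now - e.firstAttempt > WINDOW_MS) {
      this.attempts.delete(key); // window expired
      return false;
    }
    return e.count >= MAX_ATTEMPTS;
  }

  recordFailure(key: string, now = Date.now()): void {
    const e = this.attempts.get(key);
    if (!e || now - e.firstAttempt > WINDOW_MS) {
      this.attempts.set(key, { count: 1, firstAttempt: now });
    } else {
      e.count++;
    }
  }

  reset(key: string): void {
    this.attempts.delete(key); // successful login clears the counter
  }
}

const limiter = new LoginRateLimiter();
for (let i = 0; i < 5; i++) limiter.recordFailure("ip:203.0.113.7");
console.log(limiter.isBlocked("ip:203.0.113.7")); // true
```

On a blocked key the route would return 429 with the remaining lockout time; tracking both keys means one attacker IP cannot exhaust a user's attempts, and a distributed attacker still trips the per-username limit.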

* French translation (#434)

* Adding French Language

* Enhancements

* feat: Replace the old ssh tools system with a new dedicated sidebar

* fix: Merge zac/luke

* fix: Finalize new sidebar, improve loading animations

* Added ability to close non-primary tabs involved in a split view (#435)

* fix: General bug fixes/small feature improvements

* feat: General UI improvements and translation updates

* fix: Command history and file manager styling issues

* feat: General bug fixes, added server stat commands, improved split screen, link accounts, etc

* fix: add Accept header for OIDC callback request (#436)

* Delete DOWNLOADS.md

* fix: add Accept header for OIDC callback request

---------

Co-authored-by: Luke Gustafson <88517757+LukeGus@users.noreply.github.com>
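
For context on the fix above: OAuth/OIDC token endpoints commonly need an explicit `Accept: application/json` header, since some providers default to a different response encoding. A hedged sketch of the callback's token exchange; the endpoint URL and helper name are illustrative:

```typescript
// Build the authorization-code exchange request sent from the OIDC callback.
function buildTokenRequest(tokenUrl: string, code: string, redirectUri: string) {
  return {
    url: tokenUrl,
    method: "POST" as const,
    headers: {
      "Content-Type": "application/x-www-form-urlencoded",
      Accept: "application/json", // the fix: ask the provider for a JSON response
    },
    body: new URLSearchParams({
      grant_type: "authorization_code",
      code,
      redirect_uri: redirectUri,
    }).toString(),
  };
}

const req = buildTokenRequest(
  "https://idp.example.com/token",
  "abc123",
  "https://app.example.com/callback",
);
console.log(req.headers.Accept); // application/json
```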

* fix: More bug fixes and QOL fixes

* fix: Server stats not respecting interval and SSH tool type issues

* fix: Remove github links

* fix: Delete account spacing

* fix: Increment version

* fix: Unable to delete hosts and add nginx for terminal

* fix: Unable to delete hosts

* fix: Unable to delete hosts

* fix: Unable to delete hosts

* fix: OIDC/local account linking breaking both logins

* chore: File cleanup

* feat: Max terminal tab size and save current file manager sorting type

* fix: Terminal display issue, migrate host editor to use combobox

* feat: Add snippet folder/customization system

* fix: Fix OIDC linking and prep release

* fix: Increment version

---------

Co-authored-by: ZacharyZcR <zacharyzcr1984@gmail.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Max <herzmaximilian@gmail.com>
Co-authored-by: SlimGary <trash.slim@gmail.com>
Co-authored-by: jarrah31 <jarrah31@gmail.com>
Co-authored-by: Kf637 <mail@kf637.tech>
This commit was merged in pull request #437.
Author: Luke Gustafson, committed via GitHub on 2025-11-17 09:46:05 -06:00
Parent 38a59f3579, commit 8366c99b0f
104 changed files with 16070 additions and 2821 deletions


@@ -6,7 +6,7 @@ import { Client as SSHClient } from "ssh2";
import { getDb } from "../database/db/index.js";
import { sshCredentials, sshData } from "../database/db/schema.js";
import { eq, and } from "drizzle-orm";
import { fileLogger, sshLogger } from "../utils/logger.js";
import { SimpleDBOps } from "../utils/simple-db-ops.js";
import { AuthManager } from "../utils/auth-manager.js";
import type { AuthenticatedRequest } from "../../types/index.js";
@@ -89,11 +89,179 @@ app.use(express.raw({ limit: "5gb", type: "application/octet-stream" }));
const authManager = AuthManager.getInstance();
app.use(authManager.createAuthMiddleware());
async function resolveJumpHost(
hostId: number,
userId: string,
): Promise<any | null> {
try {
const hosts = await SimpleDBOps.select(
getDb()
.select()
.from(sshData)
.where(and(eq(sshData.id, hostId), eq(sshData.userId, userId))),
"ssh_data",
userId,
);
if (hosts.length === 0) {
return null;
}
const host = hosts[0];
if (host.credentialId) {
const credentials = await SimpleDBOps.select(
getDb()
.select()
.from(sshCredentials)
.where(
and(
eq(sshCredentials.id, host.credentialId as number),
eq(sshCredentials.userId, userId),
),
),
"ssh_credentials",
userId,
);
if (credentials.length > 0) {
const credential = credentials[0];
return {
...host,
password: credential.password,
key:
credential.private_key || credential.privateKey || credential.key,
keyPassword: credential.key_password || credential.keyPassword,
keyType: credential.key_type || credential.keyType,
authType: credential.auth_type || credential.authType,
};
}
}
return host;
} catch (error) {
fileLogger.error("Failed to resolve jump host", error, {
operation: "resolve_jump_host",
hostId,
userId,
});
return null;
}
}
async function createJumpHostChain(
jumpHosts: Array<{ hostId: number }>,
userId: string,
): Promise<SSHClient | null> {
if (!jumpHosts || jumpHosts.length === 0) {
return null;
}
let currentClient: SSHClient | null = null;
const clients: SSHClient[] = [];
try {
for (let i = 0; i < jumpHosts.length; i++) {
const jumpHostConfig = await resolveJumpHost(jumpHosts[i].hostId, userId);
if (!jumpHostConfig) {
fileLogger.error(`Jump host ${i + 1} not found`, undefined, {
operation: "jump_host_chain",
hostId: jumpHosts[i].hostId,
});
clients.forEach((c) => c.end());
return null;
}
const jumpClient = new SSHClient();
clients.push(jumpClient);
const connected = await new Promise<boolean>((resolve) => {
const timeout = setTimeout(() => {
resolve(false);
}, 30000);
jumpClient.on("ready", () => {
clearTimeout(timeout);
resolve(true);
});
jumpClient.on("error", (err) => {
clearTimeout(timeout);
fileLogger.error(`Jump host ${i + 1} connection failed`, err, {
operation: "jump_host_connect",
hostId: jumpHostConfig.id,
ip: jumpHostConfig.ip,
});
resolve(false);
});
const connectConfig: any = {
host: jumpHostConfig.ip,
port: jumpHostConfig.port || 22,
username: jumpHostConfig.username,
tryKeyboard: true,
readyTimeout: 30000,
};
if (jumpHostConfig.authType === "password" && jumpHostConfig.password) {
connectConfig.password = jumpHostConfig.password;
} else if (jumpHostConfig.authType === "key" && jumpHostConfig.key) {
const cleanKey = jumpHostConfig.key
.trim()
.replace(/\r\n/g, "\n")
.replace(/\r/g, "\n");
connectConfig.privateKey = Buffer.from(cleanKey, "utf8");
if (jumpHostConfig.keyPassword) {
connectConfig.passphrase = jumpHostConfig.keyPassword;
}
}
if (currentClient) {
currentClient.forwardOut(
"127.0.0.1",
0,
jumpHostConfig.ip,
jumpHostConfig.port || 22,
(err, stream) => {
if (err) {
clearTimeout(timeout);
resolve(false);
return;
}
connectConfig.sock = stream;
jumpClient.connect(connectConfig);
},
);
} else {
jumpClient.connect(connectConfig);
}
});
if (!connected) {
clients.forEach((c) => c.end());
return null;
}
currentClient = jumpClient;
}
return currentClient;
} catch (error) {
fileLogger.error("Failed to create jump host chain", error, {
operation: "jump_host_chain",
});
clients.forEach((c) => c.end());
return null;
}
}
interface SSHSession {
client: SSHClient;
isConnected: boolean;
lastActive: number;
timeout?: NodeJS.Timeout;
activeOperations: number;
}
interface PendingTOTPSession {
@@ -118,9 +286,22 @@ const pendingTOTPSessions: Record<string, PendingTOTPSession> = {};
function cleanupSession(sessionId: string) {
const session = sshSessions[sessionId];
if (session) {
if (session.activeOperations > 0) {
fileLogger.warn(
`Deferring session cleanup for ${sessionId} - ${session.activeOperations} active operations`,
{
operation: "cleanup_deferred",
sessionId,
activeOperations: session.activeOperations,
},
);
scheduleSessionCleanup(sessionId);
return;
}
try {
session.client.end();
} catch (error) {}
clearTimeout(session.timeout);
delete sshSessions[sessionId];
}
@@ -174,6 +355,7 @@ app.post("/ssh/file_manager/ssh/connect", async (req, res) => {
credentialId,
userProvidedPassword,
forceKeyboardInteractive,
jumpHosts,
} = req.body;
const userId = (req as AuthenticatedRequest).userId;
@@ -393,6 +575,7 @@ app.post("/ssh/file_manager/ssh/connect", async (req, res) => {
client,
isConnected: true,
lastActive: Date.now(),
activeOperations: 0,
};
scheduleSessionCleanup(sessionId);
res.json({ status: "success", message: "SSH connection established" });
@@ -625,7 +808,52 @@ app.post("/ssh/file_manager/ssh/connect", async (req, res) => {
},
);
if (jumpHosts && jumpHosts.length > 0 && userId) {
try {
const jumpClient = await createJumpHostChain(jumpHosts, userId);
if (!jumpClient) {
fileLogger.error("Failed to establish jump host chain", {
operation: "file_jump_chain",
sessionId,
hostId,
});
return res
.status(500)
.json({ error: "Failed to connect through jump hosts" });
}
jumpClient.forwardOut("127.0.0.1", 0, ip, port, (err, stream) => {
if (err) {
fileLogger.error("Failed to forward through jump host", err, {
operation: "file_jump_forward",
sessionId,
hostId,
ip,
port,
});
jumpClient.end();
return res.status(500).json({
error: "Failed to forward through jump host: " + err.message,
});
}
config.sock = stream;
client.connect(config);
});
} catch (error) {
fileLogger.error("Jump host error", error, {
operation: "file_jump_host",
sessionId,
hostId,
});
return res
.status(500)
.json({ error: "Failed to connect through jump hosts" });
}
} else {
client.connect(config);
}
});
app.post("/ssh/file_manager/ssh/connect-totp", async (req, res) => {
@@ -663,7 +891,9 @@ app.post("/ssh/file_manager/ssh/connect-totp", async (req, res) => {
delete pendingTOTPSessions[sessionId];
try {
session.client.end();
} catch (error) {
sshLogger.debug("Operation failed, continuing", { error });
}
fileLogger.warn("TOTP session timeout before code submission", {
operation: "file_totp_verify",
sessionId,
@@ -700,6 +930,7 @@ app.post("/ssh/file_manager/ssh/connect-totp", async (req, res) => {
client: session.client,
isConnected: true,
lastActive: Date.now(),
activeOperations: 0,
};
scheduleSessionCleanup(sessionId);
@@ -843,10 +1074,12 @@ app.get("/ssh/file_manager/ssh/listFiles", (req, res) => {
}
sshConn.lastActive = Date.now();
sshConn.activeOperations++;
const escapedPath = sshPath.replace(/'/g, "'\"'\"'");
sshConn.client.exec(`command ls -la '${escapedPath}'`, (err, stream) => {
if (err) {
sshConn.activeOperations--;
fileLogger.error("SSH listFiles error:", err);
return res.status(500).json({ error: err.message });
}
@@ -863,6 +1096,7 @@ app.get("/ssh/file_manager/ssh/listFiles", (req, res) => {
});
stream.on("close", (code) => {
sshConn.activeOperations--;
if (code !== 0) {
fileLogger.error(
`SSH listFiles command failed with code ${code}: ${errorData.replace(/\n/g, " ").trim()}`,
@@ -2486,6 +2720,516 @@ app.post("/ssh/file_manager/ssh/executeFile", async (req, res) => {
});
});
app.post("/ssh/file_manager/ssh/changePermissions", async (req, res) => {
const { sessionId, path, permissions } = req.body;
const sshConn = sshSessions[sessionId];
if (!sshConn || !sshConn.isConnected) {
fileLogger.error(
"SSH connection not found or not connected for changePermissions",
{
operation: "change_permissions",
sessionId,
hasConnection: !!sshConn,
isConnected: sshConn?.isConnected,
},
);
return res.status(400).json({ error: "SSH connection not available" });
}
if (!path) {
return res.status(400).json({ error: "File path is required" });
}
if (!permissions || !/^\d{3,4}$/.test(permissions)) {
return res.status(400).json({
error: "Valid permissions required (e.g., 755, 644)",
});
}
sshConn.lastActive = Date.now();
scheduleSessionCleanup(sessionId);
const octalPerms = permissions.slice(-3);
const escapedPath = path.replace(/'/g, "'\"'\"'");
const command = `chmod ${octalPerms} '${escapedPath}' && echo "SUCCESS"`;
fileLogger.info("Changing file permissions", {
operation: "change_permissions",
sessionId,
path,
permissions: octalPerms,
});
const commandTimeout = setTimeout(() => {
if (!res.headersSent) {
fileLogger.error("changePermissions command timeout", {
operation: "change_permissions",
sessionId,
path,
permissions: octalPerms,
});
res.status(408).json({
error: "Permission change timed out. SSH connection may be unstable.",
});
}
}, 10000);
sshConn.client.exec(command, (err, stream) => {
if (err) {
clearTimeout(commandTimeout);
fileLogger.error("SSH changePermissions exec error:", err, {
operation: "change_permissions",
sessionId,
path,
permissions: octalPerms,
});
if (!res.headersSent) {
return res.status(500).json({ error: "Failed to change permissions" });
}
return;
}
let outputData = "";
let errorOutput = "";
stream.on("data", (chunk: Buffer) => {
outputData += chunk.toString();
});
stream.stderr.on("data", (data: Buffer) => {
errorOutput += data.toString();
});
stream.on("close", (code) => {
clearTimeout(commandTimeout);
if (outputData.includes("SUCCESS")) {
fileLogger.success("File permissions changed successfully", {
operation: "change_permissions",
sessionId,
path,
permissions: octalPerms,
});
if (!res.headersSent) {
res.json({
success: true,
message: "Permissions changed successfully",
});
}
return;
}
if (code !== 0) {
fileLogger.error("chmod command failed", {
operation: "change_permissions",
sessionId,
path,
permissions: octalPerms,
exitCode: code,
error: errorOutput,
});
if (!res.headersSent) {
return res.status(500).json({
error: errorOutput || "Failed to change permissions",
});
}
return;
}
fileLogger.success("File permissions changed successfully", {
operation: "change_permissions",
sessionId,
path,
permissions: octalPerms,
});
if (!res.headersSent) {
res.json({
success: true,
message: "Permissions changed successfully",
});
}
});
stream.on("error", (streamErr) => {
clearTimeout(commandTimeout);
fileLogger.error("SSH changePermissions stream error:", streamErr, {
operation: "change_permissions",
sessionId,
path,
permissions: octalPerms,
});
if (!res.headersSent) {
res
.status(500)
.json({ error: "Stream error while changing permissions" });
}
});
});
});
// Route: Extract archive file (requires JWT)
// POST /ssh/file_manager/ssh/extractArchive
app.post("/ssh/file_manager/ssh/extractArchive", async (req, res) => {
const { sessionId, archivePath, extractPath } = req.body;
if (!sessionId || !archivePath) {
return res.status(400).json({ error: "Missing required parameters" });
}
const session = sshSessions[sessionId];
if (!session || !session.isConnected) {
return res.status(400).json({ error: "SSH session not connected" });
}
session.lastActive = Date.now();
scheduleSessionCleanup(sessionId);
const fileName = archivePath.split("/").pop() || "";
const fileExt = fileName.toLowerCase();
let extractCommand = "";
const targetPath =
extractPath || archivePath.substring(0, archivePath.lastIndexOf("/"));
if (fileExt.endsWith(".tar.gz") || fileExt.endsWith(".tgz")) {
extractCommand = `tar -xzf "${archivePath}" -C "${targetPath}"`;
} else if (fileExt.endsWith(".tar.bz2") || fileExt.endsWith(".tbz2")) {
extractCommand = `tar -xjf "${archivePath}" -C "${targetPath}"`;
} else if (fileExt.endsWith(".tar.xz")) {
extractCommand = `tar -xJf "${archivePath}" -C "${targetPath}"`;
} else if (fileExt.endsWith(".tar")) {
extractCommand = `tar -xf "${archivePath}" -C "${targetPath}"`;
} else if (fileExt.endsWith(".zip")) {
extractCommand = `unzip -o "${archivePath}" -d "${targetPath}"`;
} else if (fileExt.endsWith(".gz") && !fileExt.endsWith(".tar.gz")) {
extractCommand = `gunzip -c "${archivePath}" > "${archivePath.replace(/\.gz$/, "")}"`;
} else if (fileExt.endsWith(".bz2") && !fileExt.endsWith(".tar.bz2")) {
extractCommand = `bunzip2 -k "${archivePath}"`;
} else if (fileExt.endsWith(".xz") && !fileExt.endsWith(".tar.xz")) {
extractCommand = `unxz -k "${archivePath}"`;
} else if (fileExt.endsWith(".7z")) {
extractCommand = `7z x "${archivePath}" -o"${targetPath}"`;
} else if (fileExt.endsWith(".rar")) {
extractCommand = `unrar x "${archivePath}" "${targetPath}/"`;
} else {
return res.status(400).json({ error: "Unsupported archive format" });
}
fileLogger.info("Extracting archive", {
operation: "extract_archive",
sessionId,
archivePath,
extractPath: targetPath,
command: extractCommand,
});
session.client.exec(extractCommand, (err, stream) => {
if (err) {
fileLogger.error("SSH exec error during extract:", err, {
operation: "extract_archive",
sessionId,
archivePath,
});
return res
.status(500)
.json({ error: "Failed to execute extract command" });
}
let errorOutput = "";
stream.on("data", (data: Buffer) => {
fileLogger.debug("Extract stdout", {
operation: "extract_archive",
sessionId,
output: data.toString(),
});
});
stream.stderr.on("data", (data: Buffer) => {
errorOutput += data.toString();
fileLogger.debug("Extract stderr", {
operation: "extract_archive",
sessionId,
error: data.toString(),
});
});
stream.on("close", (code: number) => {
if (code !== 0) {
fileLogger.error("Extract command failed", {
operation: "extract_archive",
sessionId,
archivePath,
exitCode: code,
error: errorOutput,
});
let friendlyError = errorOutput || "Failed to extract archive";
if (
errorOutput.includes("command not found") ||
errorOutput.includes("not found")
) {
let missingCmd = "";
let installHint = "";
if (fileExt.endsWith(".zip")) {
missingCmd = "unzip";
installHint =
"apt install unzip / yum install unzip / brew install unzip";
} else if (
fileExt.endsWith(".tar.gz") ||
fileExt.endsWith(".tgz") ||
fileExt.endsWith(".tar.bz2") ||
fileExt.endsWith(".tbz2") ||
fileExt.endsWith(".tar.xz") ||
fileExt.endsWith(".tar")
) {
missingCmd = "tar";
installHint = "Usually pre-installed on Linux/Unix systems";
} else if (fileExt.endsWith(".gz")) {
missingCmd = "gunzip";
installHint =
"apt install gzip / yum install gzip / Usually pre-installed";
} else if (fileExt.endsWith(".bz2")) {
missingCmd = "bunzip2";
installHint =
"apt install bzip2 / yum install bzip2 / brew install bzip2";
} else if (fileExt.endsWith(".xz")) {
missingCmd = "unxz";
installHint =
"apt install xz-utils / yum install xz / brew install xz";
} else if (fileExt.endsWith(".7z")) {
missingCmd = "7z";
installHint =
"apt install p7zip-full / yum install p7zip / brew install p7zip";
} else if (fileExt.endsWith(".rar")) {
missingCmd = "unrar";
installHint =
"apt install unrar / yum install unrar / brew install unrar";
}
if (missingCmd) {
friendlyError = `Command '${missingCmd}' not found on remote server. Please install it first: ${installHint}`;
}
}
return res.status(500).json({ error: friendlyError });
}
fileLogger.success("Archive extracted successfully", {
operation: "extract_archive",
sessionId,
archivePath,
extractPath: targetPath,
});
res.json({
success: true,
message: "Archive extracted successfully",
extractPath: targetPath,
});
});
stream.on("error", (streamErr) => {
fileLogger.error("SSH extractArchive stream error:", streamErr, {
operation: "extract_archive",
sessionId,
archivePath,
});
if (!res.headersSent) {
res
.status(500)
.json({ error: "Stream error while extracting archive" });
}
});
});
});
// Route: Compress files/folders (requires JWT)
// POST /ssh/file_manager/ssh/compressFiles
app.post("/ssh/file_manager/ssh/compressFiles", async (req, res) => {
const { sessionId, paths, archiveName, format } = req.body;
if (
!sessionId ||
!paths ||
!Array.isArray(paths) ||
paths.length === 0 ||
!archiveName
) {
return res.status(400).json({ error: "Missing required parameters" });
}
const session = sshSessions[sessionId];
if (!session || !session.isConnected) {
return res.status(400).json({ error: "SSH session not connected" });
}
session.lastActive = Date.now();
scheduleSessionCleanup(sessionId);
const compressionFormat = format || "zip";
let compressCommand = "";
const firstPath = paths[0];
const workingDir = firstPath.substring(0, firstPath.lastIndexOf("/")) || "/";
const fileNames = paths
.map((p) => {
const name = p.split("/").pop();
return `"${name}"`;
})
.join(" ");
let archivePath = "";
if (archiveName.includes("/")) {
archivePath = archiveName;
} else {
archivePath = workingDir.endsWith("/")
? `${workingDir}${archiveName}`
: `${workingDir}/${archiveName}`;
}
if (compressionFormat === "zip") {
compressCommand = `cd "${workingDir}" && zip -r "${archivePath}" ${fileNames}`;
} else if (compressionFormat === "tar.gz" || compressionFormat === "tgz") {
compressCommand = `cd "${workingDir}" && tar -czf "${archivePath}" ${fileNames}`;
} else if (compressionFormat === "tar.bz2" || compressionFormat === "tbz2") {
compressCommand = `cd "${workingDir}" && tar -cjf "${archivePath}" ${fileNames}`;
} else if (compressionFormat === "tar.xz") {
compressCommand = `cd "${workingDir}" && tar -cJf "${archivePath}" ${fileNames}`;
} else if (compressionFormat === "tar") {
compressCommand = `cd "${workingDir}" && tar -cf "${archivePath}" ${fileNames}`;
} else if (compressionFormat === "7z") {
compressCommand = `cd "${workingDir}" && 7z a "${archivePath}" ${fileNames}`;
} else {
return res.status(400).json({ error: "Unsupported compression format" });
}
fileLogger.info("Compressing files", {
operation: "compress_files",
sessionId,
paths,
archivePath,
format: compressionFormat,
command: compressCommand,
});
session.client.exec(compressCommand, (err, stream) => {
if (err) {
fileLogger.error("SSH exec error during compress:", err, {
operation: "compress_files",
sessionId,
paths,
});
return res
.status(500)
.json({ error: "Failed to execute compress command" });
}
let errorOutput = "";
stream.on("data", (data: Buffer) => {
fileLogger.debug("Compress stdout", {
operation: "compress_files",
sessionId,
output: data.toString(),
});
});
stream.stderr.on("data", (data: Buffer) => {
errorOutput += data.toString();
fileLogger.debug("Compress stderr", {
operation: "compress_files",
sessionId,
error: data.toString(),
});
});
stream.on("close", (code: number) => {
if (code !== 0) {
fileLogger.error("Compress command failed", {
operation: "compress_files",
sessionId,
paths,
archivePath,
exitCode: code,
error: errorOutput,
});
let friendlyError = errorOutput || "Failed to compress files";
if (
errorOutput.includes("command not found") ||
errorOutput.includes("not found")
) {
const commandMap: Record<string, { cmd: string; install: string }> = {
zip: {
cmd: "zip",
install: "apt install zip / yum install zip / brew install zip",
},
"tar.gz": {
cmd: "tar",
install: "Usually pre-installed on Linux/Unix systems",
},
"tar.bz2": {
cmd: "tar",
install: "Usually pre-installed on Linux/Unix systems",
},
"tar.xz": {
cmd: "tar",
install: "Usually pre-installed on Linux/Unix systems",
},
tar: {
cmd: "tar",
install: "Usually pre-installed on Linux/Unix systems",
},
"7z": {
cmd: "7z",
install:
"apt install p7zip-full / yum install p7zip / brew install p7zip",
},
};
const info = commandMap[compressionFormat];
if (info) {
friendlyError = `Command '${info.cmd}' not found on remote server. Please install it first: ${info.install}`;
}
}
return res.status(500).json({ error: friendlyError });
}
fileLogger.success("Files compressed successfully", {
operation: "compress_files",
sessionId,
paths,
archivePath,
format: compressionFormat,
});
res.json({
success: true,
message: "Files compressed successfully",
archivePath: archivePath,
});
});
stream.on("error", (streamErr) => {
fileLogger.error("SSH compressFiles stream error:", streamErr, {
operation: "compress_files",
sessionId,
paths,
});
if (!res.headersSent) {
res.status(500).json({ error: "Stream error while compressing files" });
}
});
});
});
process.on("SIGINT", () => {
Object.keys(sshSessions).forEach(cleanupSession);
process.exit(0);
});