v1.9.0 (#437)
* fix: Resolve database encryption atomicity issues and enhance debugging (#430)

  * fix: Resolve database encryption atomicity issues and enhance debugging

    This commit addresses critical data corruption caused by non-atomic file writes during database encryption, and adds comprehensive diagnostic logging to help debug encryption-related failures.

    **Problem:**

    Users reported "Unsupported state or unable to authenticate data" errors when starting the application after system crashes or Docker container restarts. The root cause was non-atomic writes of encrypted database files:

    1. Encrypted data file written (step 1)
    2. Metadata file written (step 2)

    If the process crashes between steps 1 and 2, the files become inconsistent: the data file carries the new IV/tag while the metadata file still holds the old ones, so GCM authentication fails on the next startup and user data becomes permanently inaccessible.

    **Solution - Atomic Writes:**

    1. Write-to-temp + atomic-rename pattern:
       - Write to temporary files (`*.tmp-timestamp-pid`)
       - Perform atomic rename operations
       - Clean up temp files on failure
    2. Data integrity validation:
       - Add `dataSize` field to metadata
       - Verify file size before decryption
       - Early detection of corrupted writes
    3. Enhanced error diagnostics:
       - Key fingerprints (SHA256 prefix) for verification
       - File modification timestamps
       - Detailed GCM auth failure messages
       - Automatic diagnostic info generation

    **Changes:**

    database-file-encryption.ts:
    - Implement atomic write pattern in encryptDatabaseFromBuffer
    - Implement atomic write pattern in encryptDatabaseFile
    - Add dataSize field to EncryptedFileMetadata interface
    - Validate file size before decryption in decryptDatabaseToBuffer
    - Enhance error messages for GCM auth failures
    - Add getDiagnosticInfo() function for comprehensive debugging
    - Add debug logging for all encryption/decryption operations

    system-crypto.ts:
    - Add detailed logging for DATABASE_KEY initialization
    - Log key source (env var vs .env file)
    - Add key fingerprints to all log messages
    - Better error messages when key loading fails

    db/index.ts:
    - Automatically generate diagnostic info on decryption failure
    - Log detailed debugging information to help users troubleshoot

    **Debugging Info Added:**
    - Key initialization: source, fingerprint, length, path
    - Encryption: original size, encrypted size, IV/tag prefixes, temp paths
    - Decryption: file timestamps, metadata content, key fingerprint matching
    - Auth failures: .env file status, key availability, file consistency
    - File diagnostics: existence, readability, size validation, mtime comparison

    **Backward Compatibility:**
    - dataSize field is optional (metadata.dataSize?: number)
    - Old encrypted files without dataSize continue to work
    - No migration required

    **Testing:**
    - Compiled successfully
    - No breaking changes to existing APIs
    - Graceful handling of legacy v1 encrypted files

    Fixes data loss issues reported by users experiencing container restarts and system crashes during database saves.
  * fix: Cleanup PR
  * Update src/backend/utils/database-file-encryption.ts
    Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
  * Update src/backend/utils/database-file-encryption.ts
    Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
  * Update src/backend/utils/database-file-encryption.ts
    Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
  * Update src/backend/utils/database-file-encryption.ts
    Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
  * Update src/backend/utils/database-file-encryption.ts
    Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

  ---------

  Co-authored-by: LukeGus <bugattiguy527@gmail.com>
  Co-authored-by: Luke Gustafson <88517757+LukeGus@users.noreply.github.com>
  Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* fix: Merge metadata and DB into 1 file
* fix: Add initial command palette
* Feature/German language support (#431)
  * Update translation.json
    Fixed some translation issues for German, making it more user-friendly and idiomatic.
  * Update translation.json — added updated block for serverStats
  * Update translation.json — added translations
  * Update translation.json — removed duplicate of "free": "Free"
* feat: Finalize command palette
* fix: Several bug fixes for terminals, server stats, and general feature improvements
* feat: Enhanced security, UI improvements, and animations (#432)
  * fix: Remove empty catch blocks and add error logging
  * refactor: Modularize server stats widget collectors
  * feat: Add i18n support for terminal customization and login stats
    - Add comprehensive terminal customization translations (60+ keys) for appearance, behavior, and advanced settings across all 4 languages
    - Add SSH login statistics translations
    - Update HostManagerEditor to use i18n for all terminal customization UI elements
    - Update LoginStatsWidget to use i18n for all UI text
    - Add missing logger imports in backend files for improved debugging
  * feat: Add keyboard shortcut enhancements with Kbd component
    - Add shadcn kbd component for displaying keyboard shortcuts
    - Enhance file manager context menu to display shortcuts with Kbd component
    - Add 5 new keyboard shortcuts to file manager:
      - Ctrl+D: Download selected files
      - Ctrl+N: Create new file
      - Ctrl+Shift+N: Create new folder
      - Ctrl+U: Upload files
      - Enter: Open/run selected file
    - Add keyboard shortcut hints to command palette footer
    - Create helper function to parse and render keyboard shortcuts
  * feat: Add i18n support for command palette
    - Add commandPalette translation section with 22 keys to all 4 languages
    - Update CommandPalette component to use i18n for all UI text
    - Translate search placeholder, group headings, menu items, and shortcut hints
    - Support multilingual command palette interface
  * feat: Add smooth transitions and animations to UI
    - Add fade-in/fade-out transition to command palette (200ms)
    - Add scale animation to command palette on open/close
    - Add smooth popup animation to context menu (150ms)
    - Add visual feedback for file selection with ring effect
    - Add hover scale effect to file grid items
    - Add transition-all to list view items for consistent behavior
    - Zero JavaScript overhead, pure CSS transitions
    - All animations under 200ms for instant feel
  * feat: Add button active state and dashboard card animations
    - Add active:scale-95 to all buttons for tactile click feedback
    - Add hover border effect to dashboard cards (150ms transition)
    - Add pulse animation to dashboard loading states
    - Pure CSS transitions with zero JavaScript overhead
    - Improves enterprise-level feel of UI
  * feat: Add smooth macOS-style page transitions
    - Add fullscreen crossfade transition for login/logout (300ms fade-out + 400ms fade-in)
    - Add slide-in-from-right animation for all page switches (Dashboard, Terminal, SSH Manager, Admin, Profile)
    - Fix TypeScript compilation by adding esModuleInterop to tsconfig.node.json
    - Pass handleLogout from DesktopApp to LeftSidebar for consistent transition behavior

    All page transitions now use Tailwind animate-in utilities with 300ms duration for smooth, native-feeling UX
  * fix: Add key prop to force animation re-trigger on tab switch

    Each page container now has key={currentTab} so React unmounts and remounts the element on every tab switch, properly triggering the slide-in animation
  * revert: Remove page transition animations

    Page switching animations were not noticeable enough and felt unnecessary. Keep only the login/logout fullscreen crossfade transitions, which provide clear visual feedback for authentication state changes
  * feat: Add ripple effect to login/logout transitions

    Add a three-layer expanding ripple animation during the fadeOut phase:
    - Ripples expand from screen center using the primary theme color
    - Each layer has a staggered delay (0ms, 150ms, 300ms) for a wave effect
    - Ripples fade out as they expand to create elegant visual feedback
    - Uses a pure CSS keyframe animation, no external libraries

    Total animation: 800ms ripple + 300ms screen fade
  * feat: Add smooth TERMIX logo animation to transitions
    - Extend transition duration from 300ms/400ms to 800ms/600ms for a more elegant feel
    - Reduce ripple intensity from /20,/15,/10 to /8,/5 for subtlety
    - Slow down ripple animation from 0.8s to 2s with cubic-bezier easing
    - Add centered TERMIX logo with monospace font and subtitle
    - Logo fades in from 80% scale, holds, then fades out at 110% scale
    - Total effect: 1.2s logo animation synced with 2s ripple waves

    Creates a premium, branded transition experience
  * feat: Enhance transition animation with premium details

    Timing adjustments:
    - Extend fadeOut from 800ms to 1200ms
    - Extend fadeIn from 600ms to 800ms
    - Slow background fade to 700ms for elegance

    Visual enhancements:
    - Add 4-layer ripple waves (10%, 7%, 5%, 3% opacity) with staggered delays
    - Ripple animation extended to 2.5s with refined opacity curve
    - Logo blur effect: starts at 8px, sharpens to 0px, exits at 4px
    - Logo glow effect: triple-layer text-shadow using primary theme color
    - Increase logo size from text-6xl to text-7xl
    - Subtitle delayed fade-in from bottom with smooth slide animation

    Creates a cinematic, polished brand experience
  * feat: Redesign login page with split-screen cinematic layout

    Left side (40% width):
    - Full-height gradient background using primary theme color
    - Large TERMIX logo with glow effect
    - Subtitle and tagline
    - Infinite animated ripple waves (3 layers)
    - Hidden on mobile; shows brand identity

    Right side (60% width):
    - Centered glassmorphism card with backdrop blur
    - Refined tab switcher with pill-style active state
    - Enlarged title with gradient text effect
    - Added welcome subtitles for better UX
    - Card slides in from bottom on load
    - All existing functionality preserved

    Visual enhancements:
    - Tab navigation: segmented control style in muted container
    - Active tab: white background with subtle shadow
    - Smooth 200ms transitions on all interactions
    - Card: rounded-2xl, shadow-xl, semi-transparent border

    Creates a premium, modern login experience matching the transition animations
  * feat: Update login page theme colors and add i18n support
    - Changed login page gradient from blue to match dark theme colors
    - Updated ripple effects to use theme primary color
    - Added i18n translation keys for login page (auth.tagline, auth.description, auth.welcomeBack, auth.createAccount, auth.continueExternal)
    - Updated all language files (en, zh, de, ru, pt-BR) with new translations
    - Fixed TypeScript compilation issues by clearing build cache
  * refactor: Use shadcn Tabs component and fix modal styling
    - Replace custom tab navigation with shadcn Tabs component
    - Restore border-2 border-dark-border for modal consistency
    - Remove circular icon from login success message
    - Simplify authentication success display
  * refactor: Remove ripple effects and gradient from login page
    - Remove animated ripple background effects
    - Remove gradient background; use solid color (bg-dark-bg-darker)
    - Remove text-shadow glow effect from logo
    - Simplify brand showcase to a clean, minimal design
  * feat: Add decorative slash and remove subtitle from login page
    - Add decorative slash divider with gradient lines below TERMIX logo
    - Remove subtitle text (welcomeBack and createAccount)
    - Simplify page title to show only the main heading
  * feat: Add diagonal line pattern background to login page
    - Replace decorative slash with subtle diagonal line pattern background
    - Use repeating-linear-gradient at 45deg angle
    - Set very low opacity (0.03) for a subtle effect
    - Pattern uses theme primary color
  * fix: Display diagonal line pattern on login background
    - Combine background color and pattern in a single style attribute
    - Use white semi-transparent lines (rgba, 0.03 opacity)
    - 45deg angle, 35px spacing, 2px width
    - Remove separate overlay div to ensure pattern visibility
  * security: Fix user enumeration vulnerability in login
    - Unify error messages for invalid username and incorrect password
    - Both return 401 status with "Invalid username or password"
    - Prevent attackers from enumerating valid usernames
    - Maintain detailed logging for debugging purposes
    - Changed from 404 "User not found" to a generic auth failure message
  * security: Add login rate limiting to prevent brute force attacks
    - Implement LoginRateLimiter with IP- and username-based tracking
    - Block after 5 failed attempts within 15 minutes
    - Lock account/IP for 15 minutes after the threshold
    - Automatic cleanup of expired entries every 5 minutes
    - Track remaining attempts in logs for monitoring
    - Return 429 status with remaining time on rate limit
    - Reset counters on successful login
    - Dual protection: both IP-based and username-based limits
* French translation (#434)
  * Adding French language
  * Enhancements
* feat: Replace the old SSH tools system with a new dedicated sidebar
* fix: Merge zac/luke
* fix: Finalize new sidebar, improve loading animations
* Added ability to close non-primary tabs involved in a split view (#435)
* fix: General bug fixes/small feature improvements
* feat: General UI improvements and translation updates
* fix: Command history and file manager styling issues
* feat: General bug fixes, added server stat commands, improved split screen, link accounts, etc.
* fix: Add Accept header for OIDC callback request (#436)
  * Delete DOWNLOADS.md
  * fix: Add Accept header for OIDC callback request

  ---------

  Co-authored-by: Luke Gustafson <88517757+LukeGus@users.noreply.github.com>
* fix: More bug fixes and QOL fixes
* fix: Server stats not respecting interval and fixed SSH tool type issues
* fix: Remove GitHub links
* fix: Delete account spacing
* fix: Increment version
* fix: Unable to delete hosts and add nginx for terminal
* fix: Unable to delete hosts
* fix: Unable to delete hosts
* fix: Unable to delete hosts
* fix: OIDC/local account linking breaking both logins
* chore: File cleanup
* feat: Max terminal tab size and save current file manager sorting type
* fix: Terminal display issue, migrate host editor to use combobox
* feat: Add snippet folder/customization system
* fix: Fix OIDC linking and prep release
* fix: Increment version

---------

Co-authored-by: ZacharyZcR <zacharyzcr1984@gmail.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Max <herzmaximilian@gmail.com>
Co-authored-by: SlimGary <trash.slim@gmail.com>
Co-authored-by: jarrah31 <jarrah31@gmail.com>
Co-authored-by: Kf637 <mail@kf637.tech>
This commit was merged in pull request #437.
```diff
@@ -12,6 +12,7 @@ interface EncryptedFileMetadata {
algorithm: string;
keySource?: string;
salt?: string;
dataSize?: number;
}

class DatabaseFileEncryption {
@@ -25,11 +26,12 @@ class DatabaseFileEncryption {
buffer: Buffer,
targetPath: string,
): Promise<string> {
const tmpPath = `${targetPath}.tmp-${Date.now()}-${process.pid}`;
const metadataPath = `${targetPath}${this.METADATA_FILE_SUFFIX}`;

try {
const key = await this.systemCrypto.getDatabaseKey();

const iv = crypto.randomBytes(16);

const cipher = crypto.createCipheriv(
this.ALGORITHM,
key,
@@ -45,14 +47,55 @@ class DatabaseFileEncryption {
fingerprint: "termix-v2-systemcrypto",
algorithm: this.ALGORITHM,
keySource: "SystemCrypto",
dataSize: encrypted.length,
};

const metadataPath = `${targetPath}${this.METADATA_FILE_SUFFIX}`;
fs.writeFileSync(targetPath, encrypted);
fs.writeFileSync(metadataPath, JSON.stringify(metadata, null, 2));
const metadataJson = JSON.stringify(metadata, null, 2);
const metadataBuffer = Buffer.from(metadataJson, "utf8");
const metadataLengthBuffer = Buffer.alloc(4);
metadataLengthBuffer.writeUInt32BE(metadataBuffer.length, 0);

const finalBuffer = Buffer.concat([
metadataLengthBuffer,
metadataBuffer,
encrypted,
]);

fs.writeFileSync(tmpPath, finalBuffer);
fs.renameSync(tmpPath, targetPath);

try {
if (fs.existsSync(metadataPath)) {
fs.unlinkSync(metadataPath);
}
} catch (cleanupError) {
databaseLogger.warn("Failed to cleanup old metadata file", {
operation: "old_meta_cleanup_failed",
path: metadataPath,
error:
cleanupError instanceof Error
? cleanupError.message
: "Unknown error",
});
}

return targetPath;
} catch (error) {
try {
if (fs.existsSync(tmpPath)) {
fs.unlinkSync(tmpPath);
}
} catch (cleanupError) {
databaseLogger.warn("Failed to cleanup temporary files", {
operation: "temp_file_cleanup_failed",
tmpPath,
error:
cleanupError instanceof Error
? cleanupError.message
: "Unknown error",
});
}

databaseLogger.error("Failed to encrypt database buffer", error, {
operation: "database_buffer_encryption_failed",
targetPath,
@@ -74,6 +117,8 @@ class DatabaseFileEncryption {
const encryptedPath =
targetPath || `${sourcePath}${this.ENCRYPTED_FILE_SUFFIX}`;
const metadataPath = `${encryptedPath}${this.METADATA_FILE_SUFFIX}`;
const tmpPath = `${encryptedPath}.tmp-${Date.now()}-${process.pid}`;
const tmpMetadataPath = `${tmpPath}${this.METADATA_FILE_SUFFIX}`;

try {
const sourceData = fs.readFileSync(sourcePath);
@@ -93,6 +138,12 @@ class DatabaseFileEncryption {
]);
const tag = cipher.getAuthTag();

const keyFingerprint = crypto
.createHash("sha256")
.update(key)
.digest("hex")
.substring(0, 16);

const metadata: EncryptedFileMetadata = {
iv: iv.toString("hex"),
tag: tag.toString("hex"),
@@ -100,10 +151,14 @@ class DatabaseFileEncryption {
fingerprint: "termix-v2-systemcrypto",
algorithm: this.ALGORITHM,
keySource: "SystemCrypto",
dataSize: encrypted.length,
};

fs.writeFileSync(encryptedPath, encrypted);
fs.writeFileSync(metadataPath, JSON.stringify(metadata, null, 2));
fs.writeFileSync(tmpPath, encrypted);
fs.writeFileSync(tmpMetadataPath, JSON.stringify(metadata, null, 2));

fs.renameSync(tmpPath, encryptedPath);
fs.renameSync(tmpMetadataPath, metadataPath);

databaseLogger.info("Database file encrypted successfully", {
operation: "database_file_encryption",
@@ -111,11 +166,30 @@ class DatabaseFileEncryption {
encryptedPath,
fileSize: sourceData.length,
encryptedSize: encrypted.length,
keyFingerprint,
fingerprintPrefix: metadata.fingerprint,
});

return encryptedPath;
} catch (error) {
try {
if (fs.existsSync(tmpPath)) {
fs.unlinkSync(tmpPath);
}
if (fs.existsSync(tmpMetadataPath)) {
fs.unlinkSync(tmpMetadataPath);
}
} catch (cleanupError) {
databaseLogger.warn("Failed to cleanup temporary files", {
operation: "temp_file_cleanup_failed",
tmpPath,
error:
cleanupError instanceof Error
? cleanupError.message
: "Unknown error",
});
}

databaseLogger.error("Failed to encrypt database file", error, {
operation: "database_file_encryption_failed",
sourcePath,
@@ -134,16 +208,69 @@ class DatabaseFileEncryption {
);
}

const metadataPath = `${encryptedPath}${this.METADATA_FILE_SUFFIX}`;
if (!fs.existsSync(metadataPath)) {
throw new Error(`Metadata file does not exist: ${metadataPath}`);
let metadata: EncryptedFileMetadata;
let encryptedData: Buffer;

const fileBuffer = fs.readFileSync(encryptedPath);

try {
const metadataLength = fileBuffer.readUInt32BE(0);
const metadataEnd = 4 + metadataLength;

if (
metadataLength <= 0 ||
metadataEnd > fileBuffer.length ||
metadataEnd <= 4
) {
throw new Error("Invalid metadata length in single-file format");
}

const metadataJson = fileBuffer.slice(4, metadataEnd).toString("utf8");
metadata = JSON.parse(metadataJson);
encryptedData = fileBuffer.slice(metadataEnd);

if (!metadata.iv || !metadata.tag || !metadata.version) {
throw new Error("Invalid metadata structure in single-file format");
}
} catch (singleFileError) {
const metadataPath = `${encryptedPath}${this.METADATA_FILE_SUFFIX}`;
if (!fs.existsSync(metadataPath)) {
throw new Error(
`Could not read database: Not a valid single-file format and metadata file is missing: ${metadataPath}. Error: ${singleFileError.message}`,
);
}

try {
const metadataContent = fs.readFileSync(metadataPath, "utf8");
metadata = JSON.parse(metadataContent);
encryptedData = fileBuffer;
} catch (twoFileError) {
throw new Error(
`Failed to read database using both single-file and two-file formats. Error: ${twoFileError.message}`,
);
}
}

try {
const metadataContent = fs.readFileSync(metadataPath, "utf8");
const metadata: EncryptedFileMetadata = JSON.parse(metadataContent);

const encryptedData = fs.readFileSync(encryptedPath);
if (
metadata.dataSize !== undefined &&
encryptedData.length !== metadata.dataSize
) {
databaseLogger.error(
"Encrypted file size mismatch - possible corrupted write or mismatched metadata",
null,
{
operation: "database_file_size_mismatch",
encryptedPath,
actualSize: encryptedData.length,
expectedSize: metadata.dataSize,
},
);
throw new Error(
`Encrypted file size mismatch: expected ${metadata.dataSize} bytes but got ${encryptedData.length} bytes. ` +
`This indicates corrupted files or interrupted write operation.`,
);
}

let key: Buffer;
if (metadata.version === "v2") {
@@ -181,13 +308,67 @@ class DatabaseFileEncryption {

return decryptedBuffer;
} catch (error) {
const errorMessage =
error instanceof Error ? error.message : "Unknown error";
const isAuthError =
errorMessage.includes("Unsupported state") ||
errorMessage.includes("authenticate data") ||
errorMessage.includes("auth");

if (isAuthError) {
const dataDir = process.env.DATA_DIR || "./db/data";
const envPath = path.join(dataDir, ".env");

let envFileExists = false;
let envFileReadable = false;
try {
envFileExists = fs.existsSync(envPath);
if (envFileExists) {
fs.accessSync(envPath, fs.constants.R_OK);
envFileReadable = true;
}
} catch (error) {
databaseLogger.debug("Operation failed, continuing", {
error: error instanceof Error ? error.message : String(error),
});
}

databaseLogger.error(
"Database decryption authentication failed - possible causes: wrong DATABASE_KEY, corrupted files, or interrupted write",
error,
{
operation: "database_buffer_decryption_auth_failed",
encryptedPath,
dataDir,
envPath,
envFileExists,
envFileReadable,
hasEnvKey: !!process.env.DATABASE_KEY,
envKeyLength: process.env.DATABASE_KEY?.length || 0,
suggestion:
"Check if DATABASE_KEY in .env matches the key used for encryption",
},
);
throw new Error(
`Database decryption authentication failed. This usually means:\n` +
`1. DATABASE_KEY has changed or is missing from ${dataDir}/.env\n` +
`2. Encrypted file was corrupted during write (system crash/restart)\n` +
`3. Metadata file does not match encrypted data\n` +
`\nDebug info:\n` +
`- DATA_DIR: ${dataDir}\n` +
`- .env file exists: ${envFileExists}\n` +
`- .env file readable: ${envFileReadable}\n` +
`- DATABASE_KEY in environment: ${!!process.env.DATABASE_KEY}\n` +
`Original error: ${errorMessage}`,
);
}

databaseLogger.error("Failed to decrypt database to buffer", error, {
operation: "database_buffer_decryption_failed",
encryptedPath,
errorMessage,
});
throw new Error(
`Database buffer decryption failed: ${error instanceof Error ? error.message : "Unknown error"}`,
);
throw new Error(`Database buffer decryption failed: ${errorMessage}`);
}
}

@@ -215,6 +396,26 @@ class DatabaseFileEncryption {

const encryptedData = fs.readFileSync(encryptedPath);

if (
metadata.dataSize !== undefined &&
encryptedData.length !== metadata.dataSize
) {
databaseLogger.error(
"Encrypted file size mismatch - possible corrupted write or mismatched metadata",
null,
{
operation: "database_file_size_mismatch",
encryptedPath,
actualSize: encryptedData.length,
expectedSize: metadata.dataSize,
},
);
throw new Error(
`Encrypted file size mismatch: expected ${metadata.dataSize} bytes but got ${encryptedData.length} bytes. ` +
`This indicates corrupted files or interrupted write operation.`,
);
}

let key: Buffer;
if (metadata.version === "v2") {
key = await this.systemCrypto.getDatabaseKey();
@@ -274,18 +475,43 @@ class DatabaseFileEncryption {
}

static isEncryptedDatabaseFile(filePath: string): boolean {
const metadataPath = `${filePath}${this.METADATA_FILE_SUFFIX}`;

if (!fs.existsSync(filePath) || !fs.existsSync(metadataPath)) {
if (!fs.existsSync(filePath)) {
return false;
}

const metadataPath = `${filePath}${this.METADATA_FILE_SUFFIX}`;
if (fs.existsSync(metadataPath)) {
try {
const metadataContent = fs.readFileSync(metadataPath, "utf8");
const metadata: EncryptedFileMetadata = JSON.parse(metadataContent);
return (
metadata.version === this.VERSION &&
metadata.algorithm === this.ALGORITHM
);
} catch {
return false;
}
}

try {
const metadataContent = fs.readFileSync(metadataPath, "utf8");
const metadata: EncryptedFileMetadata = JSON.parse(metadataContent);
const fileBuffer = fs.readFileSync(filePath);
if (fileBuffer.length < 4) return false;

const metadataLength = fileBuffer.readUInt32BE(0);
const metadataEnd = 4 + metadataLength;

if (metadataLength <= 0 || metadataEnd > fileBuffer.length) {
return false;
}

const metadataJson = fileBuffer.slice(4, metadataEnd).toString("utf8");
const metadata: EncryptedFileMetadata = JSON.parse(metadataJson);

return (
metadata.version === this.VERSION &&
metadata.algorithm === this.ALGORITHM
metadata.algorithm === this.ALGORITHM &&
!!metadata.iv &&
!!metadata.tag
);
} catch {
return false;
@@ -322,6 +548,125 @@ class DatabaseFileEncryption {
}
}

static getDiagnosticInfo(encryptedPath: string): {
dataFile: {
exists: boolean;
size?: number;
mtime?: string;
readable?: boolean;
};
metadataFile: {
exists: boolean;
size?: number;
mtime?: string;
readable?: boolean;
content?: EncryptedFileMetadata;
};
environment: {
dataDir: string;
envPath: string;
envFileExists: boolean;
envFileReadable: boolean;
hasEnvKey: boolean;
envKeyLength: number;
};
validation: {
filesConsistent: boolean;
sizeMismatch?: boolean;
expectedSize?: number;
actualSize?: number;
};
} {
const metadataPath = `${encryptedPath}${this.METADATA_FILE_SUFFIX}`;
const dataDir = process.env.DATA_DIR || "./db/data";
const envPath = path.join(dataDir, ".env");

const result: ReturnType<typeof this.getDiagnosticInfo> = {
dataFile: { exists: false },
metadataFile: { exists: false },
environment: {
dataDir,
envPath,
envFileExists: false,
envFileReadable: false,
hasEnvKey: !!process.env.DATABASE_KEY,
envKeyLength: process.env.DATABASE_KEY?.length || 0,
},
validation: {
filesConsistent: false,
},
};

try {
result.dataFile.exists = fs.existsSync(encryptedPath);
if (result.dataFile.exists) {
try {
fs.accessSync(encryptedPath, fs.constants.R_OK);
result.dataFile.readable = true;
const stats = fs.statSync(encryptedPath);
result.dataFile.size = stats.size;
result.dataFile.mtime = stats.mtime.toISOString();
} catch {
result.dataFile.readable = false;
}
}

result.metadataFile.exists = fs.existsSync(metadataPath);
if (result.metadataFile.exists) {
try {
fs.accessSync(metadataPath, fs.constants.R_OK);
result.metadataFile.readable = true;
const stats = fs.statSync(metadataPath);
result.metadataFile.size = stats.size;
result.metadataFile.mtime = stats.mtime.toISOString();

const content = fs.readFileSync(metadataPath, "utf8");
result.metadataFile.content = JSON.parse(content);
} catch {
result.metadataFile.readable = false;
}
}

result.environment.envFileExists = fs.existsSync(envPath);
if (result.environment.envFileExists) {
try {
fs.accessSync(envPath, fs.constants.R_OK);
result.environment.envFileReadable = true;
} catch (error) {}
}

if (
result.dataFile.exists &&
result.metadataFile.exists &&
result.metadataFile.content
) {
result.validation.filesConsistent = true;

if (result.metadataFile.content.dataSize !== undefined) {
result.validation.expectedSize = result.metadataFile.content.dataSize;
result.validation.actualSize = result.dataFile.size;
result.validation.sizeMismatch =
result.metadataFile.content.dataSize !== result.dataFile.size;
if (result.validation.sizeMismatch) {
result.validation.filesConsistent = false;
}
}
}
} catch (error) {
databaseLogger.error("Failed to generate diagnostic info", error, {
operation: "diagnostic_info_failed",
encryptedPath,
});
}

databaseLogger.info("Database encryption diagnostic info", {
operation: "diagnostic_info_generated",
...result,
});

return result;
}

static async createEncryptedBackup(
databasePath: string,
backupDir: string,

```