File Handling & Media Processing - 1/2

The $25 Million Upload Catastrophe That Crashed Valentine’s Day

Picture this nightmare: February 14th, 7:00 AM PST. The most anticipated social media platform for couples is about to experience its biggest day of the year. Users worldwide are preparing to upload millions of photos, videos, and voice messages to celebrate Valentine’s Day. The infrastructure has been scaled, the servers are ready, and the CDN is primed for the traffic surge.

Then 8:00 AM hits, and digital chaos erupts.

Within the first hour, what starts as a stream of romantic gestures becomes a technical horror story:

  • Users upload 20MB wedding photos that bring the server to its knees because no one implemented file size limits
  • A single user uploads a 2GB raw video file that consumes the entire upload queue for 45 minutes
  • Image processing servers crash trying to generate thumbnails from malformed image files that passed “validation”
  • Videos uploaded in obscure formats crash the media processing pipeline when ffmpeg encounters unsupported codecs
  • The database fills up with base64-encoded images because someone thought storing files in the database was a “simple solution”
  • CDN costs skyrocket to $50,000 per hour because original 4K videos are being served instead of optimized versions

But here’s where it gets catastrophic—their file handling system, designed for “normal” usage, completely collapses under real-world file diversity:

  • Memory explosion: Loading entire files into memory causes servers to crash with out-of-memory errors
  • Storage overflow: Local disk storage fills up in 3 hours, bringing down all file upload functionality
  • Processing bottleneck: Single-threaded image processing creates a 6-hour backlog of unprocessed media
  • Security nightmare: Users upload executable files disguised as images, creating massive security vulnerabilities
  • Metadata chaos: No file metadata tracking leads to duplicate processing and storage bloat
  • Format incompatibility: 40% of uploaded videos can’t be played on mobile devices due to lack of format conversion

By noon, the damage is astronomical:

  • $25 million in lost revenue from premium subscriptions refunded due to service failures
  • 2.3 million users unable to share their Valentine’s Day moments
  • 850GB of corrupted media files requiring manual recovery
  • 1.2 million duplicate files consuming unnecessary storage
  • Complete breakdown of the content moderation system due to processing failures
  • Emergency migration to cloud storage costing $180,000 in rush implementation fees

The brutal truth? They had built a file handling system that worked fine for a few test images but had never been designed to handle the scale, diversity, and chaos of real-world media uploads.

The Uncomfortable Truth About File Handling at Scale

Here’s what separates applications that elegantly handle millions of files from those that crumble at the first high-resolution video: file handling isn’t just about storing bytes. It’s about building intelligent processing pipelines that validate, transform, optimize, and serve media efficiently while preserving security, performance, and user experience.

Most developers approach file handling like this:

  1. Accept file uploads and hope they’re reasonable sizes
  2. Store everything in the local file system and cross fingers
  3. Process images and videos synchronously and pray nothing times out
  4. Serve original files directly and wonder why the CDN bill is enormous
  5. Discover during peak traffic that file processing is a distributed systems problem requiring sophisticated solutions

But systems that handle real-world file complexity work differently:

  1. Design upload systems with intelligent validation, size limits, and security checks from day one (see the policy sketch after this list)
  2. Implement asynchronous processing pipelines that can handle any file format gracefully
  3. Build comprehensive media optimization that reduces storage costs and improves user experience
  4. Plan storage strategies that scale from gigabytes to petabytes without breaking the bank
  5. Treat file handling as a critical infrastructure component requiring monitoring, backup, and disaster recovery
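
Even before any pipeline code exists, the first two principles can be pinned down as a single policy object that every upload handler consults. A minimal sketch, with illustrative values rather than recommendations:

// Upload policy decided up front, consulted by every handler
const uploadPolicy = {
  maxBytes: { image: 10 * 1024 * 1024, video: 500 * 1024 * 1024 },
  allowedMimeTypes: ["image/jpeg", "image/png", "image/webp", "video/mp4"],
  perUserQuotaBytes: 2 * 1024 * 1024 * 1024, // 2 GB
  maxUploadsPerMinute: 30,
} as const;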

The difference isn’t just reliability. It’s what separates systems that enhance the user experience with fast, optimized media delivery from systems where file uploads become the bottleneck that takes down the entire application.

Ready to build file handling that works like Instagram’s media pipeline instead of that photo upload form that crashes when someone tries to share their vacation photos? Let’s dive into the patterns that power bulletproof file handling.


File Upload and Validation: The Gateway to Media Chaos

Production-Ready File Upload System

// Comprehensive file upload system with validation, security, and processing
class FileUploadManager {
  private storageProvider: StorageProvider;
  private validationEngine: FileValidationEngine;
  private processingQueue: MediaProcessingQueue;
  private securityScanner: FileSecurityScanner;
  private metadataExtractor: FileMetadataExtractor;
  private uploadMetrics: UploadMetrics;
  private virusScanner: VirusScanner;

  constructor(config: FileUploadConfig) {
    this.storageProvider = this.initializeStorageProvider(config.storage);
    this.validationEngine = new FileValidationEngine(config.validation);
    this.processingQueue = new MediaProcessingQueue(config.processing);
    this.securityScanner = new FileSecurityScanner(config.security);
    this.metadataExtractor = new FileMetadataExtractor();
    this.uploadMetrics = new UploadMetrics();
    this.virusScanner = new VirusScanner(config.virusScanning);

    this.setupCleanupTasks();
    this.setupMonitoring();
  }

  // Handle file upload with comprehensive validation and security
  async uploadFile(
    uploadRequest: FileUploadRequest,
    uploadOptions: UploadOptions = {}
  ): Promise<FileUploadResult> {
    const uploadId = this.generateUploadId();
    const startTime = Date.now();

    try {
      console.log(`Starting file upload: ${uploadId}`);

      // Pre-upload validation
      await this.validateUploadRequest(uploadRequest, uploadOptions);

      // Create temporary file for processing
      const tempFile = await this.createTemporaryFile(uploadRequest, uploadId);

      try {
        // Multi-stage validation pipeline
        const validationResult = await this.runValidationPipeline(
          tempFile,
          uploadRequest,
          uploadId
        );

        if (!validationResult.isValid) {
          throw new FileUploadError(
            "File validation failed",
            "VALIDATION_FAILED",
            400,
            { errors: validationResult.errors }
          );
        }

        // Extract file metadata
        const metadata = await this.extractFileMetadata(
          tempFile,
          uploadRequest
        );

        // Security scanning
        await this.performSecurityScan(tempFile, metadata, uploadId);

        // Move to permanent storage
        const storageResult = await this.moveToStorage(
          tempFile,
          metadata,
          uploadOptions
        );

        // Queue for async processing if needed
        if (this.requiresProcessing(metadata)) {
          await this.queueForProcessing(storageResult, metadata, uploadOptions);
        }

        // Record metrics
        await this.recordUploadMetrics(uploadId, metadata, startTime);

        const result: FileUploadResult = {
          uploadId,
          fileId: storageResult.fileId,
          filename: metadata.originalName,
          mimeType: metadata.mimeType,
          size: metadata.size,
          url: storageResult.url,
          thumbnailUrl: storageResult.thumbnailUrl,
          metadata: this.sanitizeMetadataForResponse(metadata),
          processingStatus: this.requiresProcessing(metadata)
            ? "queued"
            : "completed",
          uploadedAt: Date.now(),
          expiresAt: uploadOptions.ttl ? Date.now() + uploadOptions.ttl : null,
        };

        console.log(`File upload completed successfully: ${uploadId}`);
        return result;
      } finally {
        // Always cleanup temporary file
        await this.cleanupTemporaryFile(tempFile);
      }
    } catch (error) {
      await this.recordUploadError(uploadId, error, startTime);
      console.error(`File upload failed: ${uploadId}`, error);
      throw error;
    }
  }

  private async validateUploadRequest(
    request: FileUploadRequest,
    options: UploadOptions
  ): Promise<void> {
    // Check user permissions
    if (!request.userId) {
      throw new FileUploadError(
        "User authentication required",
        "UNAUTHORIZED",
        401
      );
    }

    // Check quota limits
    const userQuota = await this.checkUserQuota(request.userId);
    if (userQuota.used + request.size > userQuota.limit) {
      throw new FileUploadError(
        "Upload would exceed user quota",
        "QUOTA_EXCEEDED",
        413,
        {
          currentUsage: userQuota.used,
          limit: userQuota.limit,
          requestedSize: request.size,
        }
      );
    }

    // Rate limiting check
    const rateLimitResult = await this.checkRateLimit(request.userId);
    if (!rateLimitResult.allowed) {
      throw new FileUploadError(
        "Upload rate limit exceeded",
        "RATE_LIMITED",
        429,
        { retryAfter: rateLimitResult.retryAfter }
      );
    }

    // Basic file size validation
    if (request.size > this.getMaxFileSize(request.fileType)) {
      throw new FileUploadError(
        "File size exceeds limit",
        "FILE_TOO_LARGE",
        413,
        {
          size: request.size,
          maxSize: this.getMaxFileSize(request.fileType),
        }
      );
    }

    // File type validation
    if (!this.isAllowedFileType(request.mimeType, request.fileType)) {
      throw new FileUploadError(
        "File type not allowed",
        "INVALID_FILE_TYPE",
        400,
        {
          mimeType: request.mimeType,
          allowedTypes: this.getAllowedTypes(request.fileType),
        }
      );
    }
  }

  private async runValidationPipeline(
    tempFile: TemporaryFile,
    request: FileUploadRequest,
    uploadId: string
  ): Promise<ValidationResult> {
    const validators: FileValidator[] = [
      new MimeTypeValidator(),
      new FileSizeValidator(),
      new FileSignatureValidator(),
      new ImageValidator(),
      new VideoValidator(),
      new DocumentValidator(),
      new MalwareValidator(),
    ];

    const results: ValidationStepResult[] = [];

    for (const validator of validators) {
      if (await validator.shouldValidate(tempFile, request)) {
        try {
          console.log(`Running validator: ${validator.name} for ${uploadId}`);

          const stepResult = await validator.validate(tempFile, request);
          results.push(stepResult);

          if (!stepResult.isValid && stepResult.severity === "critical") {
            // Stop validation pipeline on critical failures
            break;
          }
        } catch (error) {
          console.error(
            `Validator ${validator.name} failed for ${uploadId}:`,
            error
          );
          results.push({
            validatorName: validator.name,
            isValid: false,
            severity: "critical",
            errors: [`Validation failed: ${error.message}`],
          });
        }
      }
    }

    const criticalErrors = results.filter(
      (r) => !r.isValid && r.severity === "critical"
    );
    const warnings = results.filter(
      (r) => !r.isValid && r.severity === "warning"
    );

    return {
      isValid: criticalErrors.length === 0,
      errors: criticalErrors.flatMap((r) => r.errors),
      warnings: warnings.flatMap((r) => r.errors),
      validationSteps: results,
    };
  }

  private async extractFileMetadata(
    tempFile: TemporaryFile,
    request: FileUploadRequest
  ): Promise<FileMetadata> {
    try {
      const basicMetadata = await this.getBasicFileMetadata(tempFile);
      const extendedMetadata = await this.metadataExtractor.extract(
        tempFile.path,
        request.mimeType
      );

      return {
        fileId: this.generateFileId(),
        originalName: request.filename,
        sanitizedName: this.sanitizeFilename(request.filename),
        mimeType: basicMetadata.mimeType,
        size: basicMetadata.size,
        checksum: basicMetadata.checksum,
        uploadedBy: request.userId,
        uploadedAt: Date.now(),

        // Extended metadata based on file type
        ...extendedMetadata,

        // Security metadata
        scanResults: [],
        riskLevel: "unknown",
      };
    } catch (error) {
      console.error("Metadata extraction failed:", error);
      throw new FileUploadError(
        "Failed to extract file metadata",
        "METADATA_EXTRACTION_FAILED",
        500,
        { originalError: error }
      );
    }
  }

  private async performSecurityScan(
    tempFile: TemporaryFile,
    metadata: FileMetadata,
    uploadId: string
  ): Promise<void> {
    console.log(`Performing security scan: ${uploadId}`);

    // File signature verification
    const signatureValid = await this.securityScanner.verifyFileSignature(
      tempFile.path,
      metadata.mimeType
    );

    if (!signatureValid) {
      throw new FileUploadError(
        "File signature does not match declared type",
        "INVALID_FILE_SIGNATURE",
        400
      );
    }

    // Executable detection
    if (await this.securityScanner.containsExecutableContent(tempFile.path)) {
      throw new FileUploadError(
        "File contains executable content",
        "EXECUTABLE_CONTENT_DETECTED",
        400
      );
    }

    // Virus scanning
    const virusScanResult = await this.virusScanner.scan(tempFile.path);
    if (virusScanResult.infected) {
      await this.quarantineFile(tempFile, virusScanResult);
      throw new FileUploadError(
        "File contains malware",
        "MALWARE_DETECTED",
        400,
        { threats: virusScanResult.threats }
      );
    }

    // Content-based security checks
    await this.performContentSecurityChecks(tempFile, metadata);

    // Update metadata with security scan results
    metadata.scanResults = [
      {
        scanner: "signature_verification",
        result: "clean",
        timestamp: Date.now(),
      },
      {
        scanner: "virus_scan",
        result: virusScanResult.clean ? "clean" : "infected",
        timestamp: Date.now(),
        details: virusScanResult,
      },
    ];

    metadata.riskLevel = this.calculateRiskLevel(metadata);
  }

  private async moveToStorage(
    tempFile: TemporaryFile,
    metadata: FileMetadata,
    options: UploadOptions
  ): Promise<StorageResult> {
    const storageKey = this.generateStorageKey(metadata, options);

    console.log(`Moving file to storage: ${metadata.fileId}`);

    try {
      // Upload to primary storage
      const uploadResult = await this.storageProvider.upload(
        tempFile.path,
        storageKey,
        {
          mimeType: metadata.mimeType,
          metadata: this.getStorageMetadata(metadata),
          encryption: options.encrypt !== false,
          redundancy: options.redundancy || "standard",
        }
      );

      // Generate optimized versions if needed
      let thumbnailUrl: string | null = null;
      if (this.shouldGenerateThumbnail(metadata)) {
        thumbnailUrl = await this.generateThumbnail(
          tempFile,
          metadata,
          storageKey
        );
      }

      return {
        fileId: metadata.fileId,
        storageKey,
        url: uploadResult.url,
        thumbnailUrl,
        size: uploadResult.size,
        etag: uploadResult.etag,
      };
    } catch (error) {
      console.error("Storage upload failed:", error);
      throw new FileUploadError("Failed to store file", "STORAGE_FAILED", 500, {
        originalError: error,
      });
    }
  }

  // Batch file upload with progress tracking
  async uploadFiles(
    files: FileUploadRequest[],
    options: BatchUploadOptions = {}
  ): Promise<BatchUploadResult> {
    const batchId = this.generateBatchId();
    const concurrency = options.concurrency || 3;

    console.log(`Starting batch upload: ${batchId} (${files.length} files)`);

    const results: FileUploadResult[] = [];
    const errors: BatchUploadError[] = [];
    const executing: Promise<any>[] = [];

    for (let i = 0; i < files.length; i++) {
      const file = files[i];

      const promise = this.uploadFile(file, {
        ...options,
        batchId,
        batchIndex: i,
      }).then(
        (result) => {
          results.push(result);
          return result;
        },
        (error) => {
          const batchError: BatchUploadError = {
            index: i,
            filename: file.filename,
            error: error.message,
            code: error.code,
          };
          errors.push(batchError);
          return batchError;
        }
      );

      // Remove the promise from the in-flight set as soon as it settles
      const tracked: Promise<unknown> = promise.finally(() => {
        const idx = executing.indexOf(tracked);
        if (idx !== -1) executing.splice(idx, 1);
      });
      executing.push(tracked);

      // Control concurrency: wait for any in-flight upload to settle
      if (executing.length >= concurrency) {
        await Promise.race(executing);
      }
    }

    // Wait for all remaining uploads
    await Promise.allSettled(executing);

    return {
      batchId,
      totalFiles: files.length,
      successfulUploads: results.length,
      failedUploads: errors.length,
      results,
      errors,
      completedAt: Date.now(),
    };
  }

  // Chunked upload for large files
  async uploadLargeFile(
    request: LargeFileUploadRequest,
    options: LargeFileUploadOptions = {}
  ): Promise<FileUploadResult> {
    const uploadId = this.generateUploadId();
    const chunkSize = options.chunkSize || 5 * 1024 * 1024; // 5MB chunks
    const totalChunks = Math.ceil(request.totalSize / chunkSize);

    console.log(
      `Starting large file upload: ${uploadId} (${request.totalSize} bytes, ${totalChunks} chunks)`
    );

    let multipartUpload: { uploadId: string } | undefined;

    try {
      // Initialize multipart upload (pre-generate a file ID for a stable key)
      const fileId = this.generateFileId();
      multipartUpload = await this.storageProvider.createMultipartUpload({
        key: this.generateStorageKey(
          { fileId, originalName: request.filename } as FileMetadata,
          options
        ),
        mimeType: request.mimeType,
        metadata: {
          originalName: request.filename,
          totalSize: request.totalSize,
          uploadId,
        },
      });

      // Track upload progress
      const uploadProgress = {
        uploadId,
        totalChunks,
        completedChunks: 0,
        uploadedBytes: 0,
        startTime: Date.now(),
      };

      const chunkResults: ChunkUploadResult[] = [];

      // Upload chunks with retry logic
      for (let chunkIndex = 0; chunkIndex < totalChunks; chunkIndex++) {
        const chunkStart = chunkIndex * chunkSize;
        const chunkEnd = Math.min(chunkStart + chunkSize, request.totalSize);
        const chunkData = request.getChunk(chunkStart, chunkEnd);

        const chunkResult = await this.uploadChunkWithRetry(
          multipartUpload.uploadId,
          chunkIndex + 1,
          chunkData,
          options.maxRetries || 3
        );

        chunkResults.push(chunkResult);
        uploadProgress.completedChunks++;
        uploadProgress.uploadedBytes += chunkData.length;

        // Report progress
        if (options.onProgress) {
          options.onProgress(uploadProgress);
        }
      }

      // Complete multipart upload
      const finalResult = await this.storageProvider.completeMultipartUpload(
        multipartUpload.uploadId,
        chunkResults
      );

      console.log(`Large file upload completed: ${uploadId}`);

      return {
        uploadId,
        fileId: finalResult.fileId,
        filename: request.filename,
        mimeType: request.mimeType,
        size: request.totalSize,
        url: finalResult.url,
        uploadedAt: Date.now(),
        processingStatus: "completed",
      };
    } catch (error) {
      console.error(`Large file upload failed: ${uploadId}`, error);

      // Cleanup the failed multipart upload, if it was created
      if (multipartUpload) {
        try {
          await this.storageProvider.abortMultipartUpload(
            multipartUpload.uploadId
          );
        } catch (cleanupError) {
          console.error("Failed to cleanup aborted upload:", cleanupError);
        }
      }

      throw error;
    }
  }

  private async uploadChunkWithRetry(
    uploadId: string,
    chunkNumber: number,
    chunkData: Buffer,
    maxRetries: number
  ): Promise<ChunkUploadResult> {
    let lastError: Error | undefined;

    for (let attempt = 1; attempt <= maxRetries; attempt++) {
      try {
        return await this.storageProvider.uploadChunk(
          uploadId,
          chunkNumber,
          chunkData
        );
      } catch (error) {
        lastError = error as Error;

        if (attempt < maxRetries) {
          const delay = Math.min(1000 * Math.pow(2, attempt), 10000); // Exponential backoff
          console.log(
            `Chunk upload failed (attempt ${attempt}), retrying in ${delay}ms`
          );
          await new Promise((resolve) => setTimeout(resolve, delay));
        }
      }
    }

    throw new FileUploadError(
      `Chunk upload failed after ${maxRetries} attempts`,
      "CHUNK_UPLOAD_FAILED",
      500,
      { chunkNumber, originalError: lastError }
    );
  }

  // File validation implementations
  private getMaxFileSize(fileType: string): number {
    const limits: Record<string, number> = {
      image: 10 * 1024 * 1024, // 10MB for images
      video: 500 * 1024 * 1024, // 500MB for videos
      audio: 50 * 1024 * 1024, // 50MB for audio
      document: 25 * 1024 * 1024, // 25MB for documents
      archive: 100 * 1024 * 1024, // 100MB for archives
    };

    return limits[fileType] || 10 * 1024 * 1024; // Default 10MB
  }

  private isAllowedFileType(mimeType: string, category: string): boolean {
    const allowedTypes: Record<string, string[]> = {
      image: [
        "image/jpeg",
        "image/png",
        "image/gif",
        "image/webp",
        "image/svg+xml", // caution: SVG can embed scripts; sanitize before serving
        "image/bmp",
        "image/tiff",
      ],
      video: [
        "video/mp4",
        "video/webm",
        "video/ogg",
        "video/quicktime", // .mov
        "video/x-msvideo", // .avi
        "video/x-ms-wmv", // .wmv
        "video/x-flv", // .flv
        "video/x-matroska", // .mkv
      ],
      audio: [
        "audio/mpeg",
        "audio/wav",
        "audio/ogg",
        "audio/aac",
        "audio/flac",
        "audio/x-m4a", // .m4a
      ],
      document: [
        "application/pdf",
        "text/plain",
        "text/csv",
        "application/msword",
        "application/vnd.openxmlformats-officedocument.wordprocessingml.document",
        "application/vnd.ms-excel",
        "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
      ],
    };

    return allowedTypes[category]?.includes(mimeType) || false;
  }

  private getAllowedTypes(category: string): string[] {
    const allowedTypes: Record<string, string[]> = {
      image: ["JPEG", "PNG", "GIF", "WebP", "SVG"],
      video: ["MP4", "WebM", "OGG", "AVI", "MOV"],
      audio: ["MP3", "WAV", "OGG", "AAC", "FLAC"],
      document: ["PDF", "TXT", "DOC", "DOCX", "XLS", "XLSX"],
    };

    return allowedTypes[category] || [];
  }

  private sanitizeFilename(filename: string): string {
    // Remove dangerous characters and normalize
    return filename
      .replace(/[^\w\s.-]/g, "") // Remove special characters
      .replace(/\s+/g, "_") // Replace spaces with underscores
      .toLowerCase()
      .substring(0, 100); // Limit length
  }

  private generateStorageKey(
    metadata: FileMetadata,
    options?: UploadOptions
  ): string {
    const date = new Date();
    const year = date.getFullYear();
    const month = String(date.getMonth() + 1).padStart(2, "0");
    const day = String(date.getDate()).padStart(2, "0");

    const extension = this.getFileExtension(metadata.originalName);

    return `uploads/${year}/${month}/${day}/${metadata.fileId}${extension}`;
  }

  private getFileExtension(filename: string): string {
    const match = filename.match(/\.[^.]*$/);
    return match ? match[0] : "";
  }

  private generateUploadId(): string {
    return `upload_${Date.now()}_${Math.random().toString(36).slice(2, 11)}`;
  }

  private generateFileId(): string {
    return `file_${Date.now()}_${Math.random().toString(36).slice(2, 11)}`;
  }

  private generateBatchId(): string {
    return `batch_${Date.now()}_${Math.random().toString(36).slice(2, 11)}`;
  }

  // Cleanup and monitoring
  private setupCleanupTasks(): void {
    // Clean up temporary files every hour
    setInterval(async () => {
      try {
        await this.cleanupExpiredTemporaryFiles();
      } catch (error) {
        console.error("Temporary file cleanup failed:", error);
      }
    }, 3600000); // 1 hour
  }

  private setupMonitoring(): void {
    // Monitor upload metrics
    setInterval(async () => {
      try {
        const metrics = await this.uploadMetrics.getRecentMetrics();
        console.log("Upload metrics:", metrics);
      } catch (error) {
        console.error("Failed to collect upload metrics:", error);
      }
    }, 300000); // 5 minutes
  }

  // Remaining helpers (quota checks, rate limiting, thumbnail generation,
  // metrics recording, quarantine, etc.) would be implemented based on your
  // specific requirements
  private async createTemporaryFile(
    request: FileUploadRequest,
    uploadId: string
  ): Promise<TemporaryFile> {
    // Implementation depends on your temporary storage solution
    throw new Error("createTemporaryFile implementation needed");
  }

  private async cleanupTemporaryFile(tempFile: TemporaryFile): Promise<void> {
    // Implementation for cleaning up temporary files
  }

  private async getBasicFileMetadata(
    tempFile: TemporaryFile
  ): Promise<BasicFileMetadata> {
    // Implementation for extracting basic metadata
    throw new Error("getBasicFileMetadata implementation needed");
  }

  private requiresProcessing(metadata: FileMetadata): boolean {
    // Determine if file needs async processing (thumbnails, transcoding, etc.)
    const processingRequired = ["image/", "video/", "audio/"];
    return processingRequired.some((type) =>
      metadata.mimeType.startsWith(type)
    );
  }

  private async queueForProcessing(
    storageResult: StorageResult,
    metadata: FileMetadata,
    options: UploadOptions
  ): Promise<void> {
    await this.processingQueue.enqueue({
      fileId: metadata.fileId,
      storageKey: storageResult.storageKey,
      mimeType: metadata.mimeType,
      processingOptions: options.processing || {},
    });
  }
}
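
// --- Storage provider sketch: S3 multipart (assumes @aws-sdk/client-s3) ---
// Maps the StorageProvider multipart methods onto S3's multipart API. S3 also
// needs the object Key on every part upload, so this sketch remembers it per
// upload ID. The bucket name and client config are illustrative.
import {
  S3Client,
  CreateMultipartUploadCommand,
  UploadPartCommand,
  CompleteMultipartUploadCommand,
  AbortMultipartUploadCommand,
} from "@aws-sdk/client-s3";

class S3MultipartStorage {
  private client = new S3Client({});
  private keysByUploadId = new Map<string, string>();

  constructor(private bucket: string) {}

  async createMultipartUpload(options: { key: string; mimeType: string }) {
    const result = await this.client.send(
      new CreateMultipartUploadCommand({
        Bucket: this.bucket,
        Key: options.key,
        ContentType: options.mimeType,
      })
    );
    this.keysByUploadId.set(result.UploadId!, options.key);
    return { uploadId: result.UploadId! };
  }

  async uploadChunk(
    uploadId: string,
    chunkNumber: number,
    data: Buffer
  ): Promise<ChunkUploadResult> {
    const result = await this.client.send(
      new UploadPartCommand({
        Bucket: this.bucket,
        Key: this.keysByUploadId.get(uploadId)!,
        UploadId: uploadId,
        PartNumber: chunkNumber, // S3 part numbers are 1-based, like the caller's
        Body: data,
      })
    );
    return { chunkNumber, etag: result.ETag! };
  }

  async completeMultipartUpload(uploadId: string, parts: ChunkUploadResult[]) {
    const key = this.keysByUploadId.get(uploadId)!;
    const result = await this.client.send(
      new CompleteMultipartUploadCommand({
        Bucket: this.bucket,
        Key: key,
        UploadId: uploadId,
        MultipartUpload: {
          Parts: parts.map((p) => ({ ETag: p.etag, PartNumber: p.chunkNumber })),
        },
      })
    );
    this.keysByUploadId.delete(uploadId);
    return { fileId: key, url: result.Location };
  }

  async abortMultipartUpload(uploadId: string): Promise<void> {
    await this.client.send(
      new AbortMultipartUploadCommand({
        Bucket: this.bucket,
        Key: this.keysByUploadId.get(uploadId)!,
        UploadId: uploadId,
      })
    );
    this.keysByUploadId.delete(uploadId);
  }
}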

// File validator implementations
class MimeTypeValidator implements FileValidator {
  name = "mime_type";

  async shouldValidate(
    file: TemporaryFile,
    request: FileUploadRequest
  ): Promise<boolean> {
    return true; // Always validate MIME type
  }

  async validate(
    file: TemporaryFile,
    request: FileUploadRequest
  ): Promise<ValidationStepResult> {
    // Use file-type library or similar to detect actual MIME type
    const detectedType = await this.detectMimeType(file.path);
    const declaredType = request.mimeType;

    if (detectedType !== declaredType) {
      return {
        validatorName: this.name,
        isValid: false,
        severity: "critical",
        errors: [
          `MIME type mismatch: declared ${declaredType}, detected ${detectedType}`,
        ],
      };
    }

    return {
      validatorName: this.name,
      isValid: true,
      severity: "info",
      errors: [],
    };
  }

  private async detectMimeType(filePath: string): Promise<string> {
    // Real implementation: a library like file-type, or the magic-byte
    // sniffing sketched below. A fixed return value would fail every upload.
    return "application/octet-stream"; // Placeholder
  }
}
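
// --- Detection sketch: magic-byte sniffing (this helper is an assumption,
// not a library API). Production code would typically reach for the file-type
// package; this shows the underlying idea of checking leading bytes.
import { open } from "node:fs/promises";

const FILE_SIGNATURES: Array<{ mime: string; bytes: number[]; offset?: number }> =
  [
    { mime: "image/jpeg", bytes: [0xff, 0xd8, 0xff] },
    { mime: "image/png", bytes: [0x89, 0x50, 0x4e, 0x47] },
    { mime: "image/gif", bytes: [0x47, 0x49, 0x46, 0x38] },
    { mime: "application/pdf", bytes: [0x25, 0x50, 0x44, 0x46] },
    { mime: "image/webp", bytes: [0x57, 0x45, 0x42, 0x50], offset: 8 }, // "WEBP" after RIFF header
  ];

async function sniffMimeType(filePath: string): Promise<string | null> {
  const handle = await open(filePath, "r");
  try {
    const { buffer } = await handle.read(Buffer.alloc(16), 0, 16, 0);
    for (const sig of FILE_SIGNATURES) {
      const start = sig.offset ?? 0;
      if (sig.bytes.every((byte, i) => buffer[start + i] === byte)) {
        return sig.mime;
      }
    }
    return null; // unknown signature: treat as a validation failure, not a pass
  } finally {
    await handle.close();
  }
}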

class ImageValidator implements FileValidator {
  name = "image";

  async shouldValidate(
    file: TemporaryFile,
    request: FileUploadRequest
  ): Promise<boolean> {
    return request.mimeType.startsWith("image/");
  }

  async validate(
    file: TemporaryFile,
    request: FileUploadRequest
  ): Promise<ValidationStepResult> {
    try {
      // Use sharp or similar library to validate image
      const imageInfo = await this.getImageInfo(file.path);

      // Check for reasonable dimensions
      if (imageInfo.width > 50000 || imageInfo.height > 50000) {
        return {
          validatorName: this.name,
          isValid: false,
          severity: "critical",
          errors: ["Image dimensions too large"],
        };
      }

      // Check for corrupted image
      if (!imageInfo.valid) {
        return {
          validatorName: this.name,
          isValid: false,
          severity: "critical",
          errors: ["Corrupted image file"],
        };
      }

      return {
        validatorName: this.name,
        isValid: true,
        severity: "info",
        errors: [],
      };
    } catch (error) {
      return {
        validatorName: this.name,
        isValid: false,
        severity: "critical",
        errors: [`Image validation failed: ${error.message}`],
      };
    }
  }

  private async getImageInfo(filePath: string): Promise<any> {
    // Implementation would use sharp or similar; see the sketch below
    return { width: 0, height: 0, valid: true };
  }
}
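
// --- Validation sketch: getImageInfo backed by sharp (assumed dependency) ---
// sharp throws on input it cannot decode, which is exactly the signal the
// validator needs; limitInputPixels guards against decompression bombs.
import sharp from "sharp";

async function getImageInfoWithSharp(
  filePath: string
): Promise<{ width: number; height: number; valid: boolean }> {
  try {
    const meta = await sharp(filePath, {
      limitInputPixels: 50000 * 50000, // mirror the dimension cap above
    }).metadata();
    return {
      width: meta.width ?? 0,
      height: meta.height ?? 0,
      valid: meta.format !== undefined,
    };
  } catch {
    return { width: 0, height: 0, valid: false }; // corrupt or unsupported file
  }
}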

// Supporting interfaces and types
interface FileUploadRequest {
  filename: string;
  mimeType: string;
  size: number;
  userId: string;
  fileType: string;
  data?: Buffer | ReadableStream;
}

interface UploadOptions {
  ttl?: number;
  encrypt?: boolean;
  redundancy?: "standard" | "high";
  processing?: ProcessingOptions;
  batchId?: string;
  batchIndex?: number;
}

interface ProcessingOptions {
  generateThumbnails?: boolean;
  optimizeImages?: boolean;
  transcodeVideo?: boolean;
  extractMetadata?: boolean;
}

interface FileUploadResult {
  uploadId: string;
  fileId: string;
  filename: string;
  mimeType: string;
  size: number;
  url: string;
  thumbnailUrl?: string | null;
  metadata: any;
  processingStatus: "queued" | "processing" | "completed" | "failed";
  uploadedAt: number;
  expiresAt?: number | null;
}

interface ValidationResult {
  isValid: boolean;
  errors: string[];
  warnings: string[];
  validationSteps: ValidationStepResult[];
}

interface ValidationStepResult {
  validatorName: string;
  isValid: boolean;
  severity: "info" | "warning" | "critical";
  errors: string[];
}

interface FileValidator {
  name: string;
  shouldValidate(
    file: TemporaryFile,
    request: FileUploadRequest
  ): Promise<boolean>;
  validate(
    file: TemporaryFile,
    request: FileUploadRequest
  ): Promise<ValidationStepResult>;
}

interface TemporaryFile {
  path: string;
  size: number;
  mimeType: string;
  createdAt: number;
}

interface FileMetadata {
  fileId: string;
  originalName: string;
  sanitizedName: string;
  mimeType: string;
  size: number;
  checksum: string;
  uploadedBy: string;
  uploadedAt: number;
  scanResults: any[];
  riskLevel: string;
  [key: string]: any; // Extended metadata
}

interface BasicFileMetadata {
  mimeType: string;
  size: number;
  checksum: string;
}

interface StorageResult {
  fileId: string;
  storageKey: string;
  url: string;
  thumbnailUrl?: string | null;
  size: number;
  etag: string;
}

interface BatchUploadOptions extends UploadOptions {
  concurrency?: number;
  failFast?: boolean;
}

interface BatchUploadResult {
  batchId: string;
  totalFiles: number;
  successfulUploads: number;
  failedUploads: number;
  results: FileUploadResult[];
  errors: BatchUploadError[];
  completedAt: number;
}

interface BatchUploadError {
  index: number;
  filename: string;
  error: string;
  code: string;
}

interface LargeFileUploadRequest {
  filename: string;
  mimeType: string;
  totalSize: number;
  getChunk: (start: number, end: number) => Buffer;
}

interface LargeFileUploadOptions extends UploadOptions {
  chunkSize?: number;
  maxRetries?: number;
  onProgress?: (progress: UploadProgress) => void;
}

interface UploadProgress {
  uploadId: string;
  totalChunks: number;
  completedChunks: number;
  uploadedBytes: number;
  startTime: number;
}

interface ChunkUploadResult {
  chunkNumber: number;
  etag: string;
}

class FileUploadError extends Error {
  constructor(
    message: string,
    public code: string,
    public status: number,
    public metadata?: any
  ) {
    super(message);
    this.name = "FileUploadError";
  }
}
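
// --- Example wiring (illustrative sketch): the manager behind an Express route.
// Assumes Express and multer; the route path, field name, and `config` object
// are placeholders, not part of the manager above.
import express from "express";
import multer from "multer";

declare const config: FileUploadConfig; // supplied by your deployment

const app = express();
// Hard size cap at the framework layer, before the manager's own validation
const upload = multer({
  dest: "/tmp/uploads",
  limits: { fileSize: 500 * 1024 * 1024 },
});
const uploadManager = new FileUploadManager(config);

app.post("/api/files", upload.single("file"), async (req, res) => {
  if (!req.file) {
    res.status(400).json({ error: "No file provided" });
    return;
  }
  try {
    const result = await uploadManager.uploadFile({
      filename: req.file.originalname,
      mimeType: req.file.mimetype, // client-declared; re-verified in the pipeline
      size: req.file.size,
      userId: (req as any).user?.id, // assumes upstream auth middleware
      fileType: req.file.mimetype.split("/")[0], // e.g. "image", "video"
    });
    res.status(201).json(result);
  } catch (error) {
    const status = error instanceof FileUploadError ? error.status : 500;
    res.status(status).json({ error: (error as Error).message });
  }
});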

// Placeholder classes that would need full implementation
interface FileUploadConfig {
  storage: any;
  validation: any;
  processing: any;
  security: any;
  virusScanning: any;
}

interface StorageProvider {
  upload(filePath: string, key: string, options: any): Promise<any>;
  createMultipartUpload(options: any): Promise<any>;
  uploadChunk(
    uploadId: string,
    chunkNumber: number,
    data: Buffer
  ): Promise<ChunkUploadResult>;
  completeMultipartUpload(
    uploadId: string,
    parts: ChunkUploadResult[]
  ): Promise<any>;
  abortMultipartUpload(uploadId: string): Promise<void>;
}

class FileValidationEngine {
  constructor(config: any) {}
}

class MediaProcessingQueue {
  constructor(config: any) {}
  async enqueue(job: any): Promise<void> {}
}

class FileSecurityScanner {
  constructor(config: any) {}
  async verifyFileSignature(path: string, mimeType: string): Promise<boolean> {
    return true;
  }
  async containsExecutableContent(path: string): Promise<boolean> {
    return false;
  }
}

class FileMetadataExtractor {
  async extract(path: string, mimeType: string): Promise<any> {
    return {};
  }
}

class UploadMetrics {
  async getRecentMetrics(): Promise<any> {
    return {};
  }
}

class VirusScanner {
  constructor(config: any) {}
  async scan(
    path: string
  ): Promise<{ infected: boolean; clean: boolean; threats?: string[] }> {
    return { infected: false, clean: true };
  }
}
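
The VirusScanner stub above deserves a real backend before anything ships. One low-dependency option is shelling out to ClamAV’s clamscan binary, which exits 0 for a clean file, 1 when a signature matches, and 2 on scanner errors. A sketch under that assumption (the binary and an up-to-date signature database must be present on the host):

// VirusScanner backed by the ClamAV CLI; fails closed on scanner errors
import { execFile } from "node:child_process";

class ClamAVScanner {
  async scan(
    path: string
  ): Promise<{ infected: boolean; clean: boolean; threats?: string[] }> {
    return new Promise((resolve, reject) => {
      execFile("clamscan", ["--no-summary", path], (error, stdout) => {
        if (!error) {
          resolve({ infected: false, clean: true }); // exit code 0: clean
        } else if (error.code === 1) {
          // Exit code 1: matches reported as "<path>: <SignatureName> FOUND"
          const threats = stdout
            .split("\n")
            .filter((line) => line.endsWith("FOUND"))
            .map((line) => line.split(": ")[1]?.replace(" FOUND", "") ?? "unknown");
          resolve({ infected: true, clean: false, threats });
        } else {
          reject(error); // exit code 2: scanner failure; do not pass the file
        }
      });
    });
  }
}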

Image Processing and Optimization: Making Your Images Web-Ready

Advanced Image Processing Pipeline

// Comprehensive image processing system with optimization and format conversion
class ImageProcessingEngine {
  private processingQueue: Queue<ImageJob>;
  private storageProvider: StorageProvider;
  private cdn: CDNProvider;
  private cache: ProcessingCache;
  private optimizer: ImageOptimizer;
  private transformer: ImageTransformer;

  constructor(config: ImageProcessingConfig) {
    this.processingQueue = new Queue("image-processing", config.queueConfig);
    this.storageProvider = new StorageProvider(config.storage);
    this.cdn = new CDNProvider(config.cdn);
    this.cache = new ProcessingCache(config.cache);
    this.optimizer = new ImageOptimizer(config.optimization);
    this.transformer = new ImageTransformer(config.transformation);

    this.setupProcessingWorkers(config.workerCount || 4);
  }

  // Process uploaded image with multiple optimizations
  async processImage(
    imageJob: ImageProcessingJob
  ): Promise<ImageProcessingResult> {
    const jobId = this.generateJobId();
    const startTime = Date.now();

    try {
      console.log(`Starting image processing: ${jobId}`);

      // Download source image
      const sourceImage = await this.downloadImage(imageJob.sourceUrl);

      // Extract image metadata
      const metadata = await this.extractImageMetadata(sourceImage);

      // Generate multiple variants
      const variants = await this.generateImageVariants(
        sourceImage,
        metadata,
        imageJob.variants
      );

      // Optimize each variant
      const optimizedVariants = await Promise.all(
        variants.map((variant) =>
          this.optimizeImageVariant(variant, imageJob.optimization)
        )
      );

      // Upload variants to storage
      const uploadResults = await this.uploadImageVariants(
        optimizedVariants,
        imageJob.destinationPath
      );

      // Update CDN cache
      await this.updateCDNCache(uploadResults);

      const result: ImageProcessingResult = {
        jobId,
        originalUrl: imageJob.sourceUrl,
        variants: uploadResults.map((upload) => ({
          size: upload.variant.size,
          url: upload.url,
          width: upload.variant.width,
          height: upload.variant.height,
          format: upload.variant.format,
          fileSize: upload.fileSize,
          optimizationSavings: upload.optimizationSavings,
        })),
        metadata,
        processingTime: Date.now() - startTime,
        processedAt: Date.now(),
      };

      console.log(`Image processing completed: ${jobId}`);
      return result;
    } catch (error) {
      console.error(`Image processing failed: ${jobId}`, error);
      throw new ImageProcessingError(
        "Image processing failed",
        "PROCESSING_FAILED",
        500,
        { jobId, originalError: error }
      );
    }
  }

  private async generateImageVariants(
    sourceImage: ProcessedImage,
    metadata: ImageMetadata,
    variantSpecs: ImageVariantSpec[]
  ): Promise<ProcessedImage[]> {
    const variants: ProcessedImage[] = [];

    for (const spec of variantSpecs) {
      try {
        console.log(`Generating variant: ${spec.name}`);

        let variant = sourceImage;

        // Resize if needed
        if (spec.width || spec.height) {
          variant = await this.transformer.resize(variant, {
            width: spec.width,
            height: spec.height,
            fit: spec.fit || "cover",
            position: spec.position || "center",
            withoutEnlargement: spec.withoutEnlargement !== false,
          });
        }

        // Apply transformations
        if (spec.transformations) {
          for (const transform of spec.transformations) {
            switch (transform.type) {
              case "crop":
                variant = await this.transformer.crop(
                  variant,
                  transform.options
                );
                break;
              case "rotate":
                variant = await this.transformer.rotate(
                  variant,
                  transform.options
                );
                break;
              case "blur":
                variant = await this.transformer.blur(
                  variant,
                  transform.options
                );
                break;
              case "sharpen":
                variant = await this.transformer.sharpen(
                  variant,
                  transform.options
                );
                break;
              case "grayscale":
                variant = await this.transformer.grayscale(variant);
                break;
              case "sepia":
                variant = await this.transformer.sepia(variant);
                break;
              case "watermark":
                variant = await this.transformer.addWatermark(
                  variant,
                  transform.options
                );
                break;
            }
          }
        }

        // Convert format if needed
        if (spec.format && spec.format !== metadata.format) {
          variant = await this.transformer.convertFormat(variant, spec.format, {
            quality: spec.quality,
            progressive: spec.progressive,
          });
        }

        // Update variant metadata
        variant.spec = spec;
        variant.name = spec.name;

        variants.push(variant);
      } catch (error) {
        console.error(`Failed to generate variant ${spec.name}:`, error);

        if (spec.required) {
          throw new ImageProcessingError(
            `Failed to generate required variant: ${spec.name}`,
            "VARIANT_GENERATION_FAILED",
            500,
            { variant: spec.name, originalError: error }
          );
        }
      }
    }

    return variants;
  }

  private async optimizeImageVariant(
    variant: ProcessedImage,
    optimization: OptimizationOptions
  ): Promise<OptimizedImage> {
    console.log(`Optimizing variant: ${variant.name}`);

    const originalSize = variant.buffer.length;
    let optimizedBuffer = variant.buffer;
    let optimizationSavings = 0;

    try {
      switch (variant.format) {
        case "jpeg":
          optimizedBuffer = await this.optimizer.optimizeJPEG(variant.buffer, {
            quality: optimization.jpegQuality || 85,
            progressive: optimization.progressive !== false,
            mozjpeg: optimization.mozjpeg !== false,
          });
          break;

        case "png":
          optimizedBuffer = await this.optimizer.optimizePNG(variant.buffer, {
            compressionLevel: optimization.pngCompressionLevel || 6,
            palette: optimization.pngPalette,
            colors: optimization.pngColors,
          });
          break;

        case "webp":
          optimizedBuffer = await this.optimizer.optimizeWebP(variant.buffer, {
            quality: optimization.webpQuality || 80,
            lossless: optimization.webpLossless,
            effort: optimization.webpEffort || 4,
          });
          break;

        case "avif":
          optimizedBuffer = await this.optimizer.optimizeAVIF(variant.buffer, {
            quality: optimization.avifQuality || 75,
            effort: optimization.avifEffort || 4,
          });
          break;
      }

      optimizationSavings =
        ((originalSize - optimizedBuffer.length) / originalSize) * 100;

      console.log(
        `Optimization complete: ${variant.name} - ${optimizationSavings.toFixed(
          1
        )}% savings`
      );
    } catch (error) {
      console.warn(
        `Optimization failed for ${variant.name}, using original:`,
        error
      );
      optimizedBuffer = variant.buffer;
    }

    return {
      ...variant,
      buffer: optimizedBuffer,
      originalSize,
      optimizedSize: optimizedBuffer.length,
      optimizationSavings,
    };
  }

  private async uploadImageVariants(
    variants: OptimizedImage[],
    basePath: string
  ): Promise<VariantUploadResult[]> {
    const uploadPromises = variants.map(async (variant) => {
      const filename = this.generateVariantFilename(basePath, variant);

      const uploadResult = await this.storageProvider.upload(
        variant.buffer,
        filename,
        {
          mimeType: `image/${variant.format}`,
          metadata: {
            width: variant.width,
            height: variant.height,
            originalSize: variant.originalSize,
            optimizedSize: variant.optimizedSize,
            optimizationSavings: variant.optimizationSavings,
          },
          cacheControl: "public, max-age=31536000", // 1 year cache
        }
      );

      return {
        variant,
        url: uploadResult.url,
        fileSize: variant.optimizedSize,
        optimizationSavings: variant.optimizationSavings,
      };
    });

    return await Promise.all(uploadPromises);
  }

  // Smart format selection based on browser support and content
  async selectOptimalFormat(
    sourceImage: ProcessedImage,
    userAgent?: string,
    acceptHeader?: string
  ): Promise<string> {
    const supportedFormats = this.getSupportedFormats(userAgent, acceptHeader);

    // Analyze image content
    const analysis = await this.analyzeImageContent(sourceImage);

    // Decision matrix for format selection
    if (supportedFormats.includes("avif") && analysis.complexity === "high") {
      return "avif"; // Best compression for complex images
    }

    if (supportedFormats.includes("webp")) {
      if (analysis.hasTransparency) {
        return "webp"; // WebP handles transparency better than JPEG
      }
      if (analysis.isPhotographic) {
        return "webp"; // Good balance for photos
      }
    }

    if (analysis.hasTransparency) {
      return "png"; // Fallback for transparency
    }

    if (analysis.isPhotographic) {
      return "jpeg"; // Traditional choice for photos
    }

    return "png"; // Safe fallback
  }

  private getSupportedFormats(
    userAgent?: string,
    acceptHeader?: string
  ): string[] {
    const formats = ["jpeg", "png"]; // Base support

    if (!userAgent && !acceptHeader) {
      return formats;
    }

    // Check Accept header first (most reliable)
    if (acceptHeader) {
      if (acceptHeader.includes("image/avif")) {
        formats.push("avif");
      }
      if (acceptHeader.includes("image/webp")) {
        formats.push("webp");
      }
    }

    // Fallback to User-Agent parsing
    if (userAgent) {
      // Chrome supports WebP and AVIF
      if (userAgent.includes("Chrome/")) {
        const chromeVersion = this.extractChromeVersion(userAgent);
        if (chromeVersion >= 85) formats.push("avif");
        if (chromeVersion >= 23) formats.push("webp");
      }

      // Firefox supports WebP and AVIF
      if (userAgent.includes("Firefox/")) {
        const firefoxVersion = this.extractFirefoxVersion(userAgent);
        if (firefoxVersion >= 93) formats.push("avif");
        if (firefoxVersion >= 65) formats.push("webp");
      }

      // Safari supports WebP in newer versions
      if (userAgent.includes("Safari/")) {
        const safariVersion = this.extractSafariVersion(userAgent);
        if (safariVersion >= 14) formats.push("webp");
      }
    }

    return formats;
  }

  private async analyzeImageContent(
    image: ProcessedImage
  ): Promise<ImageAnalysis> {
    // Use image processing library to analyze content
    const analysis = await this.transformer.analyze(image.buffer);

    return {
      isPhotographic: analysis.entropy > 7, // High entropy suggests photographic content
      hasTransparency: analysis.channels > 3,
      complexity:
        analysis.edges > 1000
          ? "high"
          : analysis.edges > 300
          ? "medium"
          : "low",
      dominantColors: analysis.dominantColors,
      averageColor: analysis.averageColor,
    };
  }

  // Responsive image generation with srcset
  async generateResponsiveImages(
    sourceUrl: string,
    breakpoints: ResponsiveBreakpoint[]
  ): Promise<ResponsiveImageSet> {
    const sourceImage = await this.downloadImage(sourceUrl);
    const metadata = await this.extractImageMetadata(sourceImage);

    // Generate variants for each breakpoint
    const variants: ResponsiveVariant[] = [];

    for (const breakpoint of breakpoints) {
      for (const density of breakpoint.densities || [1, 2]) {
        const width = Math.round(breakpoint.width * density);
        const height = breakpoint.height
          ? Math.round(breakpoint.height * density)
          : undefined;

        // Don't upscale beyond original dimensions
        if (width > metadata.width || (height && height > metadata.height)) {
          continue;
        }

        const variant = await this.transformer.resize(sourceImage, {
          width,
          height,
          fit: breakpoint.fit || "cover",
          withoutEnlargement: true,
        });

        const optimized = await this.optimizeImageVariant(variant, {
          jpegQuality: breakpoint.quality || 85,
          webpQuality: (breakpoint.quality || 85) - 5,
        });

        const filename = this.generateResponsiveFilename(
          sourceUrl,
          width,
          height,
          density,
          optimized.format
        );

        const uploadResult = await this.storageProvider.upload(
          optimized.buffer,
          filename,
          {
            mimeType: `image/${optimized.format}`,
            cacheControl: "public, max-age=31536000",
          }
        );

        variants.push({
          url: uploadResult.url,
          width,
          height: optimized.height,
          density,
          fileSize: optimized.optimizedSize,
          format: optimized.format,
        });
      }
    }

    // Generate srcset string
    const srcset = variants.map((v) => `${v.url} ${v.width}w`).join(", ");

    // Generate sizes string
    const sizes = breakpoints
      .map((bp) =>
        bp.mediaQuery ? `${bp.mediaQuery} ${bp.width}px` : `${bp.width}px`
      )
      .join(", ");

    return {
      srcset,
      sizes,
      variants,
      fallbackUrl: variants[0]?.url,
    };
  }

  // Background processing for bulk image optimization
  async processBulkImages(
    imageJobs: ImageProcessingJob[],
    options: BulkProcessingOptions = {}
  ): Promise<BulkProcessingResult> {
    const batchId = this.generateBatchId();
    const concurrency = options.concurrency || 3;

    console.log(
      `Starting bulk image processing: ${batchId} (${imageJobs.length} images)`
    );

    const results: ImageProcessingResult[] = [];
    const errors: BulkProcessingError[] = [];
    const executing: Promise<any>[] = [];

    for (let i = 0; i < imageJobs.length; i++) {
      const job = { ...imageJobs[i], batchId, batchIndex: i };

      const promise = this.processImage(job).then(
        (result) => {
          results.push(result);
          if (options.onProgress) {
            options.onProgress({
              completed: results.length + errors.length,
              total: imageJobs.length,
              batchId,
            });
          }
          return result;
        },
        (error) => {
          const bulkError: BulkProcessingError = {
            index: i,
            sourceUrl: job.sourceUrl,
            error: error.message,
            code: error.code,
          };
          errors.push(bulkError);
          if (options.onProgress) {
            options.onProgress({
              completed: results.length + errors.length,
              total: imageJobs.length,
              batchId,
            });
          }
          return bulkError;
        }
      );

      // Remove the promise from the in-flight set as soon as it settles
      const tracked: Promise<unknown> = promise.finally(() => {
        const idx = executing.indexOf(tracked);
        if (idx !== -1) executing.splice(idx, 1);
      });
      executing.push(tracked);

      // Control concurrency: wait for any in-flight job to settle
      if (executing.length >= concurrency) {
        await Promise.race(executing);
      }
    }

    // Wait for all remaining processing
    await Promise.allSettled(executing);

    return {
      batchId,
      totalImages: imageJobs.length,
      successfulProcessing: results.length,
      failedProcessing: errors.length,
      results,
      errors,
      // Coarse metric: sums per-variant percentage savings across all results
      totalSavings: results.reduce((sum, r) => {
        return (
          sum +
          r.variants.reduce(
            (variantSum, v) => variantSum + v.optimizationSavings,
            0
          )
        );
      }, 0),
      completedAt: Date.now(),
    };
  }

  private generateJobId(): string {
    return `img_${Date.now()}_${Math.random().toString(36).slice(2, 11)}`;
  }

  private generateBatchId(): string {
    return `batch_img_${Date.now()}_${Math.random().toString(36).slice(2, 11)}`;
  }

  private generateVariantFilename(
    basePath: string,
    variant: OptimizedImage
  ): string {
    const extension = variant.format;
    const size = variant.spec?.name || `${variant.width}x${variant.height}`;

    return `${basePath}_${size}.${extension}`;
  }

  private generateResponsiveFilename(
    sourceUrl: string,
    width: number,
    height?: number,
    density?: number,
    format?: string
  ): string {
    const basename =
      sourceUrl
        .split("/")
        .pop()
        ?.replace(/\.[^.]*$/, "") || "image";
    const densityStr = density && density > 1 ? `@${density}x` : "";
    const sizeStr = height ? `${width}x${height}` : `${width}w`;

    return `responsive/${basename}_${sizeStr}${densityStr}.${format || "jpg"}`;
  }

  // Utility methods
  private extractChromeVersion(userAgent: string): number {
    const match = userAgent.match(/Chrome\/(\d+)/);
    return match ? parseInt(match[1]) : 0;
  }

  private extractFirefoxVersion(userAgent: string): number {
    const match = userAgent.match(/Firefox\/(\d+)/);
    return match ? parseInt(match[1]) : 0;
  }

  private extractSafariVersion(userAgent: string): number {
    const match = userAgent.match(/Version\/(\d+)/);
    return match ? parseInt(match[1]) : 0;
  }

  // Placeholder implementations
  private async downloadImage(url: string): Promise<ProcessedImage> {
    // Implementation would download and load image
    throw new Error("downloadImage implementation needed");
  }

  private async extractImageMetadata(
    image: ProcessedImage
  ): Promise<ImageMetadata> {
    // Implementation would extract EXIF and other metadata
    throw new Error("extractImageMetadata implementation needed");
  }

  private async updateCDNCache(results: VariantUploadResult[]): Promise<void> {
    // Implementation would invalidate CDN cache
  }

  private setupProcessingWorkers(workerCount: number): void {
    // Implementation would set up background workers
  }
}

// Supporting classes and interfaces
interface ImageProcessingJob {
  sourceUrl: string;
  destinationPath: string;
  variants: ImageVariantSpec[];
  optimization: OptimizationOptions;
  batchId?: string;
  batchIndex?: number;
}

interface ImageVariantSpec {
  name: string;
  width?: number;
  height?: number;
  format?: string;
  quality?: number;
  fit?: "cover" | "contain" | "fill" | "inside" | "outside";
  position?: string;
  withoutEnlargement?: boolean;
  progressive?: boolean;
  required?: boolean;
  transformations?: ImageTransformation[];
}

interface ImageTransformation {
  type:
    | "crop"
    | "rotate"
    | "blur"
    | "sharpen"
    | "grayscale"
    | "sepia"
    | "watermark";
  options: any;
}

interface OptimizationOptions {
  jpegQuality?: number;
  webpQuality?: number;
  avifQuality?: number;
  pngCompressionLevel?: number;
  pngPalette?: boolean;
  pngColors?: number;
  progressive?: boolean;
  mozjpeg?: boolean;
  webpLossless?: boolean;
  webpEffort?: number;
  avifEffort?: number;
}

interface ProcessedImage {
  buffer: Buffer;
  width: number;
  height: number;
  format: string;
  channels: number;
  name?: string;
  spec?: ImageVariantSpec;
}

interface OptimizedImage extends ProcessedImage {
  originalSize: number;
  optimizedSize: number;
  optimizationSavings: number;
}

interface ImageMetadata {
  width: number;
  height: number;
  format: string;
  channels: number;
  density: number;
  hasAlpha: boolean;
  exif?: any;
  icc?: any;
}

interface ImageProcessingResult {
  jobId: string;
  originalUrl: string;
  variants: ProcessingVariant[];
  metadata: ImageMetadata;
  processingTime: number;
  processedAt: number;
}

interface ProcessingVariant {
  size: string;
  url: string;
  width: number;
  height: number;
  format: string;
  fileSize: number;
  optimizationSavings: number;
}

interface VariantUploadResult {
  variant: OptimizedImage;
  url: string;
  fileSize: number;
  optimizationSavings: number;
}

interface ImageAnalysis {
  isPhotographic: boolean;
  hasTransparency: boolean;
  complexity: "low" | "medium" | "high";
  dominantColors: string[];
  averageColor: string;
}

interface ResponsiveBreakpoint {
  width: number;
  height?: number;
  densities?: number[];
  quality?: number;
  fit?: "cover" | "contain";
  mediaQuery?: string;
}

interface ResponsiveVariant {
  url: string;
  width: number;
  height: number;
  density: number;
  fileSize: number;
  format: string;
}

interface ResponsiveImageSet {
  srcset: string;
  sizes: string;
  variants: ResponsiveVariant[];
  fallbackUrl: string;
}

interface BulkProcessingOptions {
  concurrency?: number;
  onProgress?: (progress: {
    completed: number;
    total: number;
    batchId: string;
  }) => void;
}

interface BulkProcessingResult {
  batchId: string;
  totalImages: number;
  successfulProcessing: number;
  failedProcessing: number;
  results: ImageProcessingResult[];
  errors: BulkProcessingError[];
  totalSavings: number;
  completedAt: number;
}

interface BulkProcessingError {
  index: number;
  sourceUrl: string;
  error: string;
  code: string;
}

class ImageProcessingError extends Error {
  constructor(
    message: string,
    public code: string,
    public status: number,
    public metadata?: any
  ) {
    super(message);
    this.name = "ImageProcessingError";
  }
}

// Placeholder classes
interface ImageProcessingConfig {
  queueConfig: any;
  storage: any;
  cdn: any;
  cache: any;
  optimization: any;
  transformation: any;
  workerCount?: number;
}

class Queue<T> {
  constructor(name: string, config: any) {}
}

class StorageProvider {
  constructor(config: any) {}
  async upload(data: Buffer, filename: string, options: any): Promise<any> {
    return { url: `https://example.com/${filename}` };
  }
}

class CDNProvider {
  constructor(config: any) {}
}

class ProcessingCache {
  constructor(config: any) {}
}

class ImageOptimizer {
  constructor(config: any) {}

  async optimizeJPEG(buffer: Buffer, options: any): Promise<Buffer> {
    return buffer; // Placeholder
  }

  async optimizePNG(buffer: Buffer, options: any): Promise<Buffer> {
    return buffer; // Placeholder
  }

  async optimizeWebP(buffer: Buffer, options: any): Promise<Buffer> {
    return buffer; // Placeholder
  }

  async optimizeAVIF(buffer: Buffer, options: any): Promise<Buffer> {
    return buffer; // Placeholder
  }
}

class ImageTransformer {
  constructor(config: any) {}

  async resize(image: ProcessedImage, options: any): Promise<ProcessedImage> {
    return image; // Placeholder
  }

  async crop(image: ProcessedImage, options: any): Promise<ProcessedImage> {
    return image; // Placeholder
  }

  async rotate(image: ProcessedImage, options: any): Promise<ProcessedImage> {
    return image; // Placeholder
  }

  async blur(image: ProcessedImage, options: any): Promise<ProcessedImage> {
    return image; // Placeholder
  }

  async sharpen(image: ProcessedImage, options: any): Promise<ProcessedImage> {
    return image; // Placeholder
  }

  async grayscale(image: ProcessedImage): Promise<ProcessedImage> {
    return image; // Placeholder
  }

  async sepia(image: ProcessedImage): Promise<ProcessedImage> {
    return image; // Placeholder
  }

  async addWatermark(
    image: ProcessedImage,
    options: any
  ): Promise<ProcessedImage> {
    return image; // Placeholder
  }

  async convertFormat(
    image: ProcessedImage,
    format: string,
    options: any
  ): Promise<ProcessedImage> {
    return { ...image, format }; // Placeholder
  }

  async analyze(buffer: Buffer): Promise<any> {
    return {}; // Placeholder
  }
}
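
For a quick sense of how the responsive types above fit together, here's a minimal sketch: a hypothetical buildResponsiveImageSet helper (not part of the engine itself) that turns already-generated ResponsiveVariant entries into the srcset and sizes strings browsers expect. The breakpoints in the sizes string are illustrative.

// Hypothetical helper: assemble srcset/sizes from generated variants
function buildResponsiveImageSet(
  variants: ResponsiveVariant[],
  fallbackUrl: string
): ResponsiveImageSet {
  // "https://cdn.example.com/img-640.webp 640w, ..." lets the browser
  // pick the smallest variant that still fills the rendered slot
  const srcset = variants
    .map((variant) => `${variant.url} ${variant.width}w`)
    .join(", ");

  // Illustrative sizes string: full viewport width on small screens,
  // half the viewport otherwise
  const sizes = "(max-width: 768px) 100vw, 50vw";

  return { srcset, sizes, variants, fallbackUrl };
}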

Video Processing Basics: Handling Motion Media

Video Processing and Transcoding System

// Advanced video processing system with transcoding and optimization
class VideoProcessingEngine {
  private transcodingQueue: Queue<VideoJob>;
  private storageProvider: StorageProvider;
  private cdn: CDNProvider;
  private ffmpegManager: FFmpegManager;
  private thumbnailGenerator: ThumbnailGenerator;
  private progressTracker: ProgressTracker;

  constructor(config: VideoProcessingConfig) {
    this.transcodingQueue = new Queue("video-transcoding", config.queueConfig);
    this.storageProvider = new StorageProvider(config.storage);
    this.cdn = new CDNProvider(config.cdn);
    this.ffmpegManager = new FFmpegManager(config.ffmpeg);
    this.thumbnailGenerator = new ThumbnailGenerator();
    this.progressTracker = new ProgressTracker();

    this.setupTranscodingWorkers(config.workerCount || 2);
  }

  // Process uploaded video with transcoding and optimization
  async processVideo(
    videoJob: VideoProcessingJob
  ): Promise<VideoProcessingResult> {
    const jobId = this.generateJobId();
    const startTime = Date.now();

    try {
      console.log(`Starting video processing: ${jobId}`);

      // Download and analyze source video
      const sourceVideo = await this.downloadVideo(videoJob.sourceUrl);
      const metadata = await this.extractVideoMetadata(sourceVideo);

      // Validate video file
      await this.validateVideoFile(sourceVideo, metadata);

      // Generate thumbnails
      const thumbnails = await this.generateVideoThumbnails(
        sourceVideo,
        videoJob.thumbnailOptions || {}
      );

      // Transcode to multiple formats/resolutions
      const transcodedVariants = await this.transcodeVideoVariants(
        jobId,
        sourceVideo,
        metadata,
        videoJob.transcodingOptions
      );

      // Upload variants to storage
      const uploadResults = await this.uploadVideoVariants(
        transcodedVariants,
        thumbnails,
        videoJob.destinationPath
      );

      // Generate HLS playlist for adaptive streaming
      const hlsPlaylist = videoJob.generateHLS
        ? await this.generateHLSPlaylist(uploadResults.variants)
        : null;

      // Update CDN and clear cache
      await this.updateCDNCache(uploadResults);

      const result: VideoProcessingResult = {
        jobId,
        originalUrl: videoJob.sourceUrl,
        originalMetadata: metadata,
        variants: uploadResults.variants.map((v) => ({
          resolution: v.resolution,
          bitrate: v.bitrate,
          format: v.format,
          url: v.url,
          fileSize: v.fileSize,
          duration: v.duration,
        })),
        thumbnails: uploadResults.thumbnails.map((t) => ({
          timestamp: t.timestamp,
          url: t.url,
          width: t.width,
          height: t.height,
        })),
        hlsPlaylistUrl: hlsPlaylist?.masterPlaylistUrl,
        processingTime: Date.now() - startTime,
        processedAt: Date.now(),
      };

      console.log(`Video processing completed: ${jobId}`);
      return result;
    } catch (error) {
      console.error(`Video processing failed: ${jobId}`, error);
      throw new VideoProcessingError(
        "Video processing failed",
        "PROCESSING_FAILED",
        500,
        { jobId, originalError: error }
      );
    }
  }

  private async transcodeVideoVariants(
    jobId: string,
    sourceVideo: SourceVideo,
    metadata: VideoMetadata,
    options: TranscodingOptions[]
  ): Promise<TranscodedVariant[]> {
    const variants: TranscodedVariant[] = [];

    for (const option of options) {
      try {
        console.log(`Transcoding variant: ${option.name}`);

        // Determine output resolution
        const outputResolution = this.calculateOutputResolution(
          metadata.width,
          metadata.height,
          option.resolution
        );

        // Skip if upscaling is disabled and target is larger than source
        if (
          option.noUpscale &&
          (outputResolution.width > metadata.width ||
            outputResolution.height > metadata.height)
        ) {
          console.log(`Skipping upscale for ${option.name}`);
          continue;
        }

        // Build FFmpeg command
        const ffmpegCommand = this.buildTranscodingCommand(
          sourceVideo.path,
          outputResolution,
          option
        );

        // Execute transcoding, reporting progress under the parent job's ID
        const outputPath = await this.executeTranscoding(
          ffmpegCommand,
          () =>
            this.progressTracker.updateProgress(
              jobId,
              "transcoding",
              option.name
            )
        );

        // Get transcoded file info
        const transcodedMetadata = await this.extractVideoMetadata({
          path: outputPath,
        });

        variants.push({
          name: option.name,
          path: outputPath,
          resolution: `${outputResolution.width}x${outputResolution.height}`,
          bitrate: option.bitrate,
          format: option.format,
          codec: option.videoCodec,
          audioCodec: option.audioCodec,
          duration: transcodedMetadata.duration,
          fileSize: transcodedMetadata.fileSize,
          width: outputResolution.width,
          height: outputResolution.height,
        });
      } catch (error) {
        console.error(`Failed to transcode variant ${option.name}:`, error);

        if (option.required) {
          throw new VideoProcessingError(
            `Failed to transcode required variant: ${option.name}`,
            "TRANSCODING_FAILED",
            500,
            { variant: option.name, originalError: error }
          );
        }
      }
    }

    return variants;
  }

  private buildTranscodingCommand(
    inputPath: string,
    resolution: { width: number; height: number },
    options: TranscodingOptions
  ): string[] {
    const args: string[] = [
      "-i",
      inputPath,
      "-y", // Overwrite output file
    ];

    // Video encoding options
    if (options.videoCodec) {
      args.push("-c:v", options.videoCodec);
    }

    // Resolution
    args.push("-s", `${resolution.width}x${resolution.height}`);

    // Bitrate
    if (options.bitrate) {
      args.push("-b:v", `${options.bitrate}k`);
    }

    // Frame rate
    if (options.frameRate) {
      args.push("-r", options.frameRate.toString());
    }

    // Audio encoding options
    if (options.audioCodec) {
      args.push("-c:a", options.audioCodec);
    }

    if (options.audioBitrate) {
      args.push("-b:a", `${options.audioBitrate}k`);
    }

    // Format-specific options
    switch (options.format) {
      case "mp4":
        args.push(
          "-movflags",
          "+faststart", // Enable fast start for web playback
          "-preset",
          options.preset || "medium",
          "-crf",
          (options.crf || 23).toString()
        );
        break;

      case "webm":
        args.push("-deadline", "good", "-cpu-used", "2");
        break;

      case "hls":
        args.push(
          "-hls_time",
          "10",
          "-hls_list_size",
          "0",
          "-hls_segment_type",
          "mpegts"
        );
        break;
    }

    // Two-pass encoding (simplified: a real implementation invokes FFmpeg
    // twice, with pass 1 writing rate statistics to a log file that pass 2
    // then reads; only the analysis pass is sketched here)
    if (options.twoPass) {
      args.push(
        "-pass",
        "1",
        "-an", // No audio needed for the analysis pass
        "-f",
        "null"
      );
    }

    // Filters
    const filters: string[] = [];

    if (options.deinterlace) {
      filters.push("yadif");
    }

    if (options.denoise) {
      filters.push("hqdn3d");
    }

    if (filters.length > 0) {
      args.push("-vf", filters.join(","));
    }

    return args;
  }

  private async executeTranscoding(
    command: string[],
    progressCallback?: (progress: number) => void
  ): Promise<string> {
    return new Promise((resolve, reject) => {
      // Note: the output path has no file extension, so the command must set
      // the container explicitly (e.g. "-f mp4") for FFmpeg to know the format
      const outputPath = `/tmp/transcoded_${Date.now()}_${Math.random()
        .toString(36)
        .slice(2, 11)}`;

      const ffmpeg = this.ffmpegManager.spawn([...command, outputPath]);

      let stderr = "";

      ffmpeg.stderr?.on("data", (data) => {
        const chunk = data.toString();
        stderr += chunk;

        // Parse progress from the latest stderr chunk; matching against the
        // accumulated buffer would always return the first (stale) timestamp
        if (progressCallback) {
          const progressMatch = chunk.match(/time=(\d+):(\d+):(\d+\.\d+)/);
          if (progressMatch) {
            const hours = parseInt(progressMatch[1]);
            const minutes = parseInt(progressMatch[2]);
            const seconds = parseFloat(progressMatch[3]);
            const currentTime = hours * 3600 + minutes * 60 + seconds;

            // Reports elapsed media time in seconds; a percentage would
            // require knowing the total duration
            progressCallback(currentTime);
          }
        }
      });

      ffmpeg.on("close", (code) => {
        if (code === 0) {
          resolve(outputPath);
        } else {
          reject(
            new Error(`FFmpeg process exited with code ${code}: ${stderr}`)
          );
        }
      });

      ffmpeg.on("error", reject);
    });
  }

  private calculateOutputResolution(
    sourceWidth: number,
    sourceHeight: number,
    targetResolution: string
  ): { width: number; height: number } {
    const aspectRatio = sourceWidth / sourceHeight;

    // H.264/H.265 with the common yuv420p pixel format require even
    // dimensions, so round the computed edge to the nearest even number
    const even = (n: number) => 2 * Math.round(n / 2);

    switch (targetResolution) {
      case "240p":
        return aspectRatio > 1
          ? { width: even(240 * aspectRatio), height: 240 }
          : { width: 240, height: even(240 / aspectRatio) };

      case "360p":
        return aspectRatio > 1
          ? { width: even(360 * aspectRatio), height: 360 }
          : { width: 360, height: even(360 / aspectRatio) };

      case "480p":
        return aspectRatio > 1
          ? { width: even(480 * aspectRatio), height: 480 }
          : { width: 480, height: even(480 / aspectRatio) };

      case "720p":
        return { width: 1280, height: 720 };

      case "1080p":
        return { width: 1920, height: 1080 };

      case "1440p":
        return { width: 2560, height: 1440 };

      case "4k":
        return { width: 3840, height: 2160 };

      default:
        // Parse custom resolution like "800x600"
        const match = targetResolution.match(/^(\d+)x(\d+)$/);
        if (match) {
          return { width: parseInt(match[1]), height: parseInt(match[2]) };
        }

        // Fallback to source resolution
        return { width: sourceWidth, height: sourceHeight };
    }
  }

  // Generate video thumbnails at various timestamps
  private async generateVideoThumbnails(
    sourceVideo: SourceVideo,
    options: ThumbnailOptions
  ): Promise<VideoThumbnail[]> {
    const thumbnails: VideoThumbnail[] = [];
    const timestamps = options.timestamps || [0, 0.25, 0.5, 0.75]; // Defaults are fractions (0-1) of the video duration

    const metadata = await this.extractVideoMetadata(sourceVideo);

    for (const timestamp of timestamps) {
      try {
        // Values <= 1 are treated as fractions of the duration,
        // anything larger as absolute seconds
        const timeInSeconds =
          timestamp <= 1 ? timestamp * metadata.duration : timestamp;

        console.log(`Generating thumbnail at ${timeInSeconds}s`);

        const thumbnailPath = await this.extractVideoFrame(
          sourceVideo.path,
          timeInSeconds,
          options
        );

        // Optimize thumbnail image
        const optimizedThumbnail = await this.optimizeThumbnail(
          thumbnailPath,
          options
        );

        thumbnails.push({
          timestamp: timeInSeconds,
          path: optimizedThumbnail.path,
          width: optimizedThumbnail.width,
          height: optimizedThumbnail.height,
          fileSize: optimizedThumbnail.fileSize,
        });
      } catch (error) {
        console.error(`Failed to generate thumbnail at ${timestamp}:`, error);
      }
    }

    return thumbnails;
  }

  private async extractVideoFrame(
    inputPath: string,
    timestamp: number,
    options: ThumbnailOptions
  ): Promise<string> {
    const outputPath = `/tmp/thumb_${Date.now()}_${Math.random()
      .toString(36)
      .slice(2, 11)}.jpg`;

    const command = [
      "-i",
      inputPath,
      "-ss",
      timestamp.toString(), // Seeking after -i is frame-accurate but slower than a pre-input seek
      "-vframes",
      "1",
      "-q:v",
      "2", // High quality (JPEG quality scale: lower is better)
      "-y",
      outputPath,
    ];

    // Add size constraints if specified (FFmpeg's scale filter uses ":" as
    // the separator; -1 preserves aspect ratio on the unspecified edge)
    if (options.width || options.height) {
      const scale =
        options.width && options.height
          ? `${options.width}:${options.height}`
          : options.width
          ? `${options.width}:-1`
          : `-1:${options.height}`;

      command.splice(-2, 0, "-vf", `scale=${scale}`);
    }

    await this.executeFFmpegCommand(command);
    return outputPath;
  }

  private async executeFFmpegCommand(command: string[]): Promise<void> {
    return new Promise((resolve, reject) => {
      const ffmpeg = this.ffmpegManager.spawn(command);

      let stderr = "";

      ffmpeg.stderr?.on("data", (data) => {
        stderr += data.toString();
      });

      ffmpeg.on("close", (code) => {
        if (code === 0) {
          resolve();
        } else {
          reject(
            new Error(`FFmpeg command failed with code ${code}: ${stderr}`)
          );
        }
      });

      ffmpeg.on("error", reject);
    });
  }

  // Generate HLS playlist for adaptive bitrate streaming
  private async generateHLSPlaylist(
    variants: TranscodedVariant[]
  ): Promise<HLSPlaylist> {
    const playlistEntries: HLSVariant[] = [];

    for (const variant of variants) {
      if (variant.format === "hls") {
        // Generate individual playlist for this variant
        const playlistContent = this.generateM3U8Playlist(variant);
        const playlistPath = `${variant.path}.m3u8`;

        await this.writePlaylistFile(playlistPath, playlistContent);

        playlistEntries.push({
          bandwidth: variant.bitrate * 1000,
          resolution: variant.resolution,
          codecs: "avc1.64001e,mp4a.40.2", // Assumed H.264 + AAC; derive from the variant's actual codecs in real code
          playlistUrl: playlistPath,
        });
      }
    }

    // Generate master playlist
    const masterPlaylist = this.generateMasterPlaylist(playlistEntries);
    const masterPlaylistPath = `/tmp/master_${Date.now()}.m3u8`;

    await this.writePlaylistFile(masterPlaylistPath, masterPlaylist);

    return {
      masterPlaylistUrl: masterPlaylistPath,
      variants: playlistEntries,
    };
  }

  private generateMasterPlaylist(variants: HLSVariant[]): string {
    let content = "#EXTM3U\n#EXT-X-VERSION:3\n\n";

    for (const variant of variants) {
      content += `#EXT-X-STREAM-INF:BANDWIDTH=${variant.bandwidth},RESOLUTION=${variant.resolution},CODECS="${variant.codecs}"\n`;
      content += `${variant.playlistUrl}\n\n`;
    }

    return content;
  }

  private generateM3U8Playlist(variant: TranscodedVariant): string {
    // This is a simplified version - real implementation would need segment information
    return `#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment000.ts
#EXTINF:10.0,
segment001.ts
#EXT-X-ENDLIST
`;
  }

  // Utility methods
  private async downloadVideo(url: string): Promise<SourceVideo> {
    // Implementation would download video file
    throw new Error("downloadVideo implementation needed");
  }

  private async extractVideoMetadata(
    video: SourceVideo
  ): Promise<VideoMetadata> {
    // Implementation would use FFprobe to extract metadata
    throw new Error("extractVideoMetadata implementation needed");
  }

  private async validateVideoFile(
    video: SourceVideo,
    metadata: VideoMetadata
  ): Promise<void> {
    // Validate video file format, duration, etc.
    if (metadata.duration > 3600) {
      // 1 hour limit
      throw new VideoProcessingError(
        "Video duration exceeds maximum allowed",
        "DURATION_TOO_LONG",
        400
      );
    }
  }

  private async optimizeThumbnail(
    thumbnailPath: string,
    options: ThumbnailOptions
  ): Promise<any> {
    // Implementation would optimize thumbnail image
    return { path: thumbnailPath, width: 640, height: 360, fileSize: 50000 };
  }

  private async uploadVideoVariants(
    variants: TranscodedVariant[],
    thumbnails: VideoThumbnail[],
    basePath: string
  ): Promise<any> {
    // Implementation would upload all variants and thumbnails
    return { variants: [], thumbnails: [] };
  }

  private async writePlaylistFile(
    path: string,
    content: string
  ): Promise<void> {
    // Implementation would write playlist file
  }

  private async updateCDNCache(uploadResults: any): Promise<void> {
    // Implementation would update CDN cache
  }

  private generateJobId(): string {
    return `video_${Date.now()}_${Math.random().toString(36).slice(2, 11)}`;
  }

  private setupTranscodingWorkers(workerCount: number): void {
    // Implementation would set up background workers for transcoding
  }
}

// Supporting interfaces and types
interface VideoProcessingJob {
  sourceUrl: string;
  destinationPath: string;
  transcodingOptions: TranscodingOptions[];
  thumbnailOptions?: ThumbnailOptions;
  generateHLS?: boolean;
}

interface TranscodingOptions {
  name: string;
  resolution: string;
  bitrate: number;
  format: string;
  videoCodec?: string;
  audioCodec?: string;
  audioBitrate?: number;
  frameRate?: number;
  preset?: string;
  crf?: number;
  twoPass?: boolean;
  deinterlace?: boolean;
  denoise?: boolean;
  noUpscale?: boolean;
  required?: boolean;
}

interface ThumbnailOptions {
  timestamps?: number[]; // Either percentages (0-1) or absolute seconds
  width?: number;
  height?: number;
  format?: "jpg" | "png" | "webp";
  quality?: number;
}

interface SourceVideo {
  path: string;
  size: number;
}

interface VideoMetadata {
  duration: number;
  width: number;
  height: number;
  frameRate: number;
  bitrate: number;
  format: string;
  videoCodec: string;
  audioCodec: string;
  fileSize: number;
}

interface TranscodedVariant {
  name: string;
  path: string;
  resolution: string;
  bitrate: number;
  format: string;
  codec: string;
  audioCodec: string;
  duration: number;
  fileSize: number;
  width: number;
  height: number;
}

interface VideoThumbnail {
  timestamp: number;
  path: string;
  width: number;
  height: number;
  fileSize: number;
}

interface HLSPlaylist {
  masterPlaylistUrl: string;
  variants: HLSVariant[];
}

interface HLSVariant {
  bandwidth: number;
  resolution: string;
  codecs: string;
  playlistUrl: string;
}

interface VideoProcessingResult {
  jobId: string;
  originalUrl: string;
  originalMetadata: VideoMetadata;
  variants: any[];
  thumbnails: any[];
  hlsPlaylistUrl?: string;
  processingTime: number;
  processedAt: number;
}

class VideoProcessingError extends Error {
  constructor(
    message: string,
    public code: string,
    public status: number,
    public metadata?: any
  ) {
    super(message);
    this.name = "VideoProcessingError";
  }
}

// Placeholder classes
interface VideoProcessingConfig {
  queueConfig: any;
  storage: any;
  cdn: any;
  ffmpeg: any;
  workerCount?: number;
}

import { spawn as spawnProcess, ChildProcess } from "child_process";

class FFmpegManager {
  constructor(private config: any) {}

  spawn(args: string[]): ChildProcess {
    // Spawn the ffmpeg binary; ffmpegPath is an assumed config field,
    // falling back to whatever "ffmpeg" resolves to on PATH
    return spawnProcess(this.config?.ffmpegPath || "ffmpeg", args);
  }
}

class ThumbnailGenerator {
  constructor() {}
}

class ProgressTracker {
  updateProgress(jobId: string, stage: string, variant: string): void {
    console.log(`Progress update: ${jobId} - ${stage} - ${variant}`);
  }
}
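
To make the moving parts concrete, here's a hedged sketch of how a caller might drive the engine. The config values, variant ladder, and bitrates are illustrative assumptions, and the download/upload stubs above would need real implementations before this runs end to end.

// Illustrative usage only; every config value here is an assumption
async function handleUploadedVideo(sourceUrl: string): Promise<void> {
  const engine = new VideoProcessingEngine({
    queueConfig: {},
    storage: { bucket: "media-uploads" }, // hypothetical storage config
    cdn: {},
    ffmpeg: { ffmpegPath: "ffmpeg" }, // assumed config field (see FFmpegManager)
    workerCount: 2,
  });

  const result = await engine.processVideo({
    sourceUrl,
    destinationPath: "videos/processed",
    generateHLS: true,
    transcodingOptions: [
      // A typical ladder: one required baseline, larger sizes skipped
      // automatically when they would upscale the source
      {
        name: "480p",
        resolution: "480p",
        bitrate: 1000,
        format: "mp4",
        videoCodec: "libx264",
        audioCodec: "aac",
        required: true,
      },
      {
        name: "720p",
        resolution: "720p",
        bitrate: 2500,
        format: "mp4",
        videoCodec: "libx264",
        audioCodec: "aac",
        noUpscale: true,
      },
      {
        name: "1080p",
        resolution: "1080p",
        bitrate: 5000,
        format: "mp4",
        videoCodec: "libx264",
        audioCodec: "aac",
        noUpscale: true,
      },
    ],
    thumbnailOptions: { timestamps: [0.1, 0.5], width: 640 },
  });

  console.log(
    `Processed ${result.variants.length} variants in ${result.processingTime}ms`
  );
}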

Key Takeaways

File handling and media processing isn’t just about storing and serving files—it’s about building intelligent pipelines that can validate, transform, optimize, and deliver media efficiently while maintaining security, performance, and user experience.

Essential file handling patterns:

  • Comprehensive validation with multi-stage security checks and content analysis
  • Intelligent image processing with format optimization and responsive variants
  • Video transcoding pipelines that generate multiple resolutions and formats
  • Smart storage strategies that balance cost, performance, and accessibility
  • CDN integration for global content delivery and optimization

The production-ready file handling framework:

  • Use streaming uploads for large files to avoid memory exhaustion (see the sketch after this list)
  • Implement background processing for CPU-intensive operations like video transcoding
  • Build format detection that doesn’t rely solely on file extensions
  • Design progressive enhancement for image and video delivery
  • Plan storage lifecycle management for cost optimization
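
As referenced above, here's a minimal streaming-upload sketch using only Node's built-in stream APIs. The saveUploadStream helper and the hard byte cap are illustrative, not a full upload handler.

import { createWriteStream } from "fs";
import { pipeline } from "stream/promises";
import { Transform, Readable } from "stream";

// Reject the stream once it exceeds maxBytes instead of buffering it all
function sizeLimit(maxBytes: number): Transform {
  let seen = 0;
  return new Transform({
    transform(chunk, _encoding, callback) {
      seen += chunk.length;
      if (seen > maxBytes) {
        callback(new Error(`Upload exceeds ${maxBytes} byte limit`));
      } else {
        callback(null, chunk);
      }
    },
  });
}

// Stream the request body straight to a temp file: memory use stays at
// chunk size whether the upload is 2MB or 2GB
async function saveUploadStream(
  body: Readable,
  tempPath: string,
  maxBytes: number
): Promise<void> {
  await pipeline(body, sizeLimit(maxBytes), createWriteStream(tempPath));
}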

File handling best practices:

  • Always validate file signatures against declared MIME types (sketched after this list)
  • Never trust user-provided filenames or metadata
  • Implement virus scanning for user-uploaded content
  • Use temporary storage for processing and cleanup after completion
  • Generate multiple variants for different device capabilities and network conditions
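
And a sketch of that signature check, assuming the handful of magic bytes below. A production validator would cover far more formats, for example via a library like file-type.

// Well-known magic bytes for a few common image types (illustrative subset)
const SIGNATURES: Array<{ mime: string; bytes: number[] }> = [
  { mime: "image/jpeg", bytes: [0xff, 0xd8, 0xff] },
  { mime: "image/png", bytes: [0x89, 0x50, 0x4e, 0x47, 0x0d, 0x0a, 0x1a, 0x0a] },
  { mime: "image/gif", bytes: [0x47, 0x49, 0x46, 0x38] }, // "GIF8"
];

// Compare the file's leading bytes against the MIME type the client declared
function signatureMatchesMime(header: Buffer, declaredMime: string): boolean {
  const expected = SIGNATURES.find((s) => s.mime === declaredMime);
  if (!expected) return false; // Unknown type: reject rather than trust

  return expected.bytes.every((byte, index) => header[index] === byte);
}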

The architecture decision framework:

  • Use local storage for temporary processing files and small-scale applications
  • Use cloud storage for scalability, redundancy, and global distribution
  • Use CDN delivery for static assets and optimized media serving
  • Use background queues for time-intensive processing operations (a minimal sketch follows this list)
  • Use streaming processing for large files to maintain memory efficiency
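
As promised, a minimal in-memory background queue with bounded concurrency. This is a teaching sketch; a production system would use a durable broker (Redis/BullMQ, SQS, or similar) so jobs survive restarts.

// Hypothetical in-memory queue: at most `concurrency` jobs run at once
class BackgroundQueue<T> {
  private pending: T[] = [];
  private active = 0;

  constructor(
    private handler: (job: T) => Promise<void>,
    private concurrency = 2
  ) {}

  enqueue(job: T): void {
    this.pending.push(job);
    this.drain();
  }

  private drain(): void {
    // Start jobs until the concurrency limit is reached or the queue empties
    while (this.active < this.concurrency && this.pending.length > 0) {
      const job = this.pending.shift()!;
      this.active++;
      this.handler(job)
        .catch((error) => console.error("Job failed:", error))
        .finally(() => {
          this.active--;
          this.drain(); // A finished slot frees capacity for the next job
        });
    }
  }
}

The request handler enqueues a transcoding job and returns immediately; workers chew through the backlog at their own pace instead of blocking the upload response.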

What’s Next?

In the next blog, we’ll complete our file handling journey by diving into large file handling and streaming, metadata extraction and management, file security and access control, backup and archiving strategies, and media transcoding and formats.

We’ll explore the operational challenges of managing petabytes of user-generated content—from handling 4K video uploads without bringing down your servers to building backup systems that can restore terabytes of data in minutes when disaster strikes.

Because handling a few test images is straightforward. Building file systems that can process millions of uploads per day while maintaining security, performance, and cost efficiency—that’s where file handling becomes mission-critical infrastructure.