⚡ Optimize config file I/O with caching #72

@ryota-murakami

Description

Problem

Configuration is loaded from disk on every CLI invocation, even though it rarely changes. This adds unnecessary I/O overhead.

Current behavior:

const gitExtension = (_args) => {
  loadConfig() // ← Reads from disk every time
  // ... setup commands
}

Issues:

  • Synchronous file read on every command
  • Redundant disk I/O for unchanged config
  • Slight performance impact (minor for CLI)

Performance Impact

While minor for a CLI tool, this optimization would:

  • Reduce startup time by ~1-5ms
  • Eliminate unnecessary file system calls
  • Improve responsiveness

Proposed Solution

Option 1: In-Memory Cache with File Watch

import fs, { watch } from 'fs'

let cachedConfig = null
let configWatcher = null

/**
 * Loads config with in-memory caching
 * Invalidates cache when file changes
 */
function loadConfig() {
  // Return cached config if available
  if (cachedConfig) {
    return cachedConfig
  }

  try {
    if (fs.existsSync(CONFIG_FILE)) {
      const config = JSON.parse(fs.readFileSync(CONFIG_FILE, 'utf8'))
      cachedConfig = config
      
      // Watch for changes to invalidate cache
      if (!configWatcher) {
        configWatcher = watch(CONFIG_FILE, () => {
          cachedConfig = null // Invalidate cache
        })
        // Don't let the watcher keep the short-lived CLI process alive
        configWatcher.unref()
      }
      
      return config
    }
  } catch (error) {
    console.error('Error loading configuration:', error)
  }
  
  return {} // Return empty config as default
}

// Cleanup watcher on exit
process.on('exit', () => {
  if (configWatcher) {
    configWatcher.close()
  }
})

Option 2: Simple In-Memory Cache

import fs from 'fs'

let configCache = null
let configLoadTime = null
const CACHE_TTL = 60000 // 60 seconds

function loadConfig() {
  const now = Date.now()
  
  // Return cache if fresh
  if (configCache && (now - configLoadTime) < CACHE_TTL) {
    return configCache
  }

  try {
    if (fs.existsSync(CONFIG_FILE)) {
      configCache = JSON.parse(fs.readFileSync(CONFIG_FILE, 'utf8'))
      configLoadTime = now
      return configCache
    }
  } catch (error) {
    console.error('Error loading configuration:', error)
  }
  
  return {}
}

Option 3: Convert to Async I/O

import { promises as fs } from 'fs'

/**
 * Asynchronously loads configuration
 * Better for Node.js event loop
 */
async function loadConfig() {
  try {
    if (await fs.access(CONFIG_FILE).then(() => true).catch(() => false)) {
      const content = await fs.readFile(CONFIG_FILE, 'utf8')
      const config = JSON.parse(content)
      
      if (config.model) model = config.model
      if (config.language) language = config.language
      if (config.prefixEnabled !== undefined) {
        prefixState.setEnabled(config.prefixEnabled)
      }
      if (config.apiKey) apiKey = config.apiKey
    }
  } catch (error) {
    console.error('Error loading configuration:', error)
  }
}

// Update CLI initialization
async function main() {
  await loadConfig() // Load config first
  program.parse(process.argv)
}

main()

Benchmark Comparison

Current (synchronous, no cache):

$ time git gpt config
# ~15-20ms total (includes ~2-5ms for config load)

With caching (option 1/2):

$ time git gpt config
# First loadConfig() call in a process: ~15-20ms total
# Later loadConfig() calls in the same process: cache hit, no disk read
# (an in-memory cache does not persist across separate CLI invocations)

With async (option 3):

$ time git gpt config
# ~15-20ms (non-blocking I/O)
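
For a more repeatable measurement than time(1), the config load itself could be timed in-process along these lines (a sketch only; the numbers above are estimates, not measurements, and it assumes loadConfig() and the Option 2 cache variables are in scope):

// Sketch only: assumes loadConfig() and configCache from Option 2 are in scope
import { performance } from 'perf_hooks'

function bench(label, fn, iterations = 1000) {
  const start = performance.now()
  for (let i = 0; i < iterations; i++) fn()
  const perCall = (performance.now() - start) / iterations
  console.log(`${label}: ${perCall.toFixed(3)}ms per call`)
}

bench('loadConfig, cold cache', () => {
  configCache = null // Force a disk read on every call
  loadConfig()
})

bench('loadConfig, warm cache', () => loadConfig())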

Recommendation

Option 2 (Simple in-memory cache with TTL) is best:

  • ✅ Easy to implement
  • ✅ No file watcher overhead
  • ✅ Reasonable TTL prevents stale config
  • ✅ Minimal complexity

Option 3 is also good for consistency (project uses async/await elsewhere).

Implementation Plan

  1. Implement simple cache with TTL
  2. Update loadConfig() to check cache
  3. Add cache invalidation on saveConfig() (see the sketch below)
  4. Add tests for cache behavior
  5. Measure performance impact
  6. Document caching behavior

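To make step 3 concrete, here is a minimal sketch of a saveConfig() that refreshes the module-level cache from Option 2 after writing, so the next loadConfig() does not return stale data. The CONFIG_FILE constant and the configCache/configLoadTime variables are assumed from the snippets above.

// Sketch only: reuses configCache / configLoadTime from Option 2
function saveConfig(partial) {
  const existing = loadConfig() // May be served from the cache
  const updated = { ...existing, ...partial }

  fs.writeFileSync(CONFIG_FILE, JSON.stringify(updated, null, 2))

  // Refresh the cache rather than just invalidating it, so the next
  // loadConfig() call skips the disk read entirely
  configCache = updated
  configLoadTime = Date.now()

  return updated
}
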
Alternative: Config Service Class

As part of architecture refactoring (#67), this could be part of a ConfigService:

// src/services/ConfigService.js
import { promises as fs } from 'fs'

export class ConfigService {
  constructor(configPath) {
    this.configPath = configPath
    this.cache = null
    this.cacheTime = null
    this.cacheTTL = 60000
  }

  async load() {
    if (this.cache && (Date.now() - this.cacheTime) < this.cacheTTL) {
      return this.cache
    }

    try {
      const content = await fs.readFile(this.configPath, 'utf8')
      this.cache = JSON.parse(content)
      this.cacheTime = Date.now()
      return this.cache
    } catch (error) {
      console.error('Error loading config:', error)
      return {}
    }
  }

  async save(config) {
    const existing = await this.load()
    const updated = { ...existing, ...config }
    await fs.writeFile(this.configPath, JSON.stringify(updated, null, 2))
    this.cache = updated
    this.cacheTime = Date.now()
  }

  invalidateCache() {
    this.cache = null
  }
}

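For reference, wiring the service into the CLI could look roughly like this; the config path and import paths are placeholders, not the project's actual layout.

// Sketch only: paths are illustrative
import os from 'os'
import path from 'path'
import { ConfigService } from './services/ConfigService.js'

const configService = new ConfigService(
  path.join(os.homedir(), '.config', 'git-gpt', 'config.json')
)

const config = await configService.load()      // First call reads from disk
const cached = await configService.load()      // Served from the in-memory cache within the TTL
await configService.save({ language: 'en' })   // Persists changes and refreshes the cache
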
Benefits

  • ✅ Faster CLI startup (minor improvement)
  • ✅ Reduced file system I/O
  • ✅ Better practice (cache frequently accessed data)
  • ✅ Prepares for ConfigService refactoring

Acceptance Criteria

  • Implement config caching with TTL
  • Invalidate cache on config save
  • Add tests for cache behavior (see the test sketch below)
  • Measure performance improvement
  • Document caching behavior
  • No breaking changes

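Assuming a Jest-style test runner, a cache-behavior test could look roughly like the following; the module path and the use of jest.mock are assumptions about the project's test setup, not its actual structure.

// Sketch only: module path and test runner are assumptions
import fs from 'fs'
import { loadConfig } from '../src/config.js' // hypothetical export

jest.mock('fs')

test('second loadConfig() call within the TTL does not touch the disk', () => {
  fs.existsSync.mockReturnValue(true)
  fs.readFileSync.mockReturnValue('{"language":"en"}')

  loadConfig()
  loadConfig()

  // Only the first call should read the file; the second is a cache hit
  expect(fs.readFileSync).toHaveBeenCalledTimes(1)
})
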
Priority

Low - Performance optimization, minor impact

Related
