feat(auth): implement core authentication infrastructure #36

Patrick-Ehimen merged 1 commit into `main`
Conversation
Reviewer's Guide

This PR introduces a full authentication infrastructure for multi-client API key support by adding new auth modules (key handling, caching, rate limiting, request context, service pooling, and a manager), extending the server configuration with auth and performance defaults, and enabling optional per-request API keys in all MCP tools.

ER diagram for the new authentication configuration types:

```mermaid
erDiagram
    AUTHCONFIG {
        string defaultApiKey
        boolean enablePerRequestAuth
        boolean requireAuthentication
        CACHECONFIG keyValidationCache
        RATELIMITCONFIG rateLimiting
    }
    CACHECONFIG {
        boolean enabled
        int maxSize
        int ttlSeconds
        int cleanupIntervalSeconds
    }
    RATELIMITCONFIG {
        boolean enabled
        int requestsPerMinute
        int burstLimit
        boolean keyBasedLimiting
    }
    PERFORMANCECONFIG {
        int servicePoolSize
        int serviceTimeoutMinutes
        int concurrentRequestLimit
    }
    AUTHCONFIG ||--|| CACHECONFIG : contains
    AUTHCONFIG ||--|| RATELIMITCONFIG : contains
```
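The configuration types in the diagram above could be expressed as TypeScript interfaces. This is a sketch: the field names come from the diagram, but the default values shown are illustrative assumptions, not the PR's actual defaults.

```typescript
interface CacheConfig {
  enabled: boolean;
  maxSize: number;
  ttlSeconds: number;
  cleanupIntervalSeconds: number;
}

interface RateLimitConfig {
  enabled: boolean;
  requestsPerMinute: number;
  burstLimit: number;
  keyBasedLimiting: boolean;
}

interface AuthConfig {
  defaultApiKey: string;
  enablePerRequestAuth: boolean;
  requireAuthentication: boolean;
  keyValidationCache: CacheConfig;
  rateLimiting: RateLimitConfig;
}

// Illustrative defaults only; the real server config may differ.
const defaultAuthConfig: AuthConfig = {
  defaultApiKey: "",
  enablePerRequestAuth: true,
  requireAuthentication: false,
  keyValidationCache: {
    enabled: true,
    maxSize: 1000,
    ttlSeconds: 300,
    cleanupIntervalSeconds: 60,
  },
  rateLimiting: {
    enabled: true,
    requestsPerMinute: 60,
    burstLimit: 10,
    keyBasedLimiting: true,
  },
};
```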
Class diagram for the new authentication infrastructure:

```mermaid
classDiagram
    class AuthManager {
        - config: AuthConfig
        - cache: KeyValidationCache
        - rateLimiter: RateLimiter
        + validateApiKey(apiKey: string): Promise~ValidationResult~
        + getEffectiveApiKey(requestKey?: string): Promise~string~
        + authenticate(requestKey?: string): Promise~AuthenticationResult~
        + sanitizeApiKey(apiKey: string): string
        + isRateLimited(apiKey: string): boolean
        + invalidateKey(apiKey: string): void
        + getCacheStats()
        + getRateLimitStatus(apiKey: string)
        + destroy(): void
    }
    class KeyValidationCache {
        - cache: Map~string, CacheEntry~
        - config: CacheConfig
        + get(keyHash: string): ValidationResult | null
        + set(keyHash: string, result: ValidationResult): void
        + invalidate(keyHash: string): void
        + clear(): void
        + getStats()
        + destroy(): void
    }
    class RateLimiter {
        - limits: Map~string, RateLimitEntry~
        - config: RateLimitConfig
        + isAllowed(keyHash: string): RateLimitResult
        + recordRequest(keyHash: string): void
        + getStatus(keyHash: string): RateLimitResult
        + reset(keyHash: string): void
        + clear(): void
        + destroy(): void
    }
    class SecureKeyHandler {
        + hashKey(apiKey: string): string
        + sanitizeForLogs(apiKey: string): string
        + secureCompare(a: string, b: string): boolean
        + clearFromMemory(obj: any, keys: string[]): void
        + isValidFormat(apiKey: string): boolean
    }
    class RequestContext {
        + apiKey: string
        + keyHash: string
        + service: ILighthouseService
        + toolName: string
        + requestId: string
        + timestamp: Date
        + toLogContext(): LogContext
        + getAge(): number
        + isExpired(timeoutMs: number): boolean
    }
    class LighthouseServiceFactory {
        - services: Map~string, ServiceEntry~
        - config: PerformanceConfig
        + createService(apiKey: string): Promise~ILighthouseService~
        + getService(apiKey: string): Promise~ILighthouseService~
        + removeService(apiKey: string): void
        + clear(): void
        + getStats()
        + destroy(): void
    }
    AuthManager --> KeyValidationCache
    AuthManager --> RateLimiter
    AuthManager --> SecureKeyHandler
    KeyValidationCache --> CacheEntry
    RateLimiter --> RateLimitEntry
    RequestContext --> ILighthouseService
    RequestContext --> LogContext
    LighthouseServiceFactory --> ServiceEntry
    LighthouseServiceFactory --> ILighthouseService
```
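A minimal sketch of how SecureKeyHandler's `hashKey` and `secureCompare` could be implemented with Node's built-in `crypto` module. This is an assumption from the class diagram; the PR's actual implementation may differ.

```typescript
import { createHash, timingSafeEqual } from "node:crypto";

// Digest the raw key so caches and rate-limit maps never store it directly.
function hashKey(apiKey: string): string {
  return createHash("sha256").update(apiKey).digest("hex");
}

// Compare fixed-length digests so the comparison leaks neither the
// keys' lengths nor a position-dependent timing signal.
function secureCompare(a: string, b: string): boolean {
  const digestA = createHash("sha256").update(a).digest();
  const digestB = createHash("sha256").update(b).digest();
  return timingSafeEqual(digestA, digestB);
}
```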
Hey there - I've reviewed your changes - here's some feedback:
- Extend all tool input parameter interfaces from BaseToolParams instead of manually adding apiKey to each tool to reduce duplication.
- Respect the enablePerRequestAuth flag in AuthManager to conditionally disable per-request key validation when configured.
- Ensure RateLimiter and KeyValidationCache fully honor their configuration options (e.g. burstLimit, keyBasedLimiting) and track cache hit/miss counts so that stats like hitRate are meaningful.
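The first suggestion above could be sketched like this. `BaseToolParams`, `RunAuditParams`, and `effectiveKey` are hypothetical names used for illustration, not code from the PR.

```typescript
// Shared base: every tool gets the optional per-request key from one place.
interface BaseToolParams {
  /** Optional per-request API key, overriding the server default. */
  apiKey?: string;
}

// Each tool extends the base instead of redeclaring apiKey.
interface RunAuditParams extends BaseToolParams {
  url: string;
}

// Resolve the key a request should actually use.
function effectiveKey(params: BaseToolParams, defaultKey: string): string {
  return params.apiKey ?? defaultKey;
}
```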
## Individual Comments
### Comment 1
<location> `apps/mcp-server/src/auth/RateLimiter.ts:23` </location>
<code_context>
+ * Check if a request is allowed for the given key
+ */
+ isAllowed(keyHash: string): RateLimitResult {
+ if (!this.config.enabled) {
+ return {
+ allowed: true,
</code_context>
<issue_to_address>
**suggestion (bug_risk):** Rate limiting logic does not enforce burstLimit.
BurstLimit from RateLimitConfig should be enforced to prevent short-term request spikes, not just requestsPerMinute.
Suggested implementation:
```typescript
isAllowed(keyHash: string): RateLimitResult {
  if (!this.config.enabled) {
    return {
      allowed: true,
      remaining: Infinity,
      resetTime: new Date(Date.now() + 60000),
    };
  }
  const now = Date.now();
  const windowStart = now - 60 * 1000; // 1 minute window
  // Burst limit enforcement: use a 1 second window, or make it configurable
  const burstWindowMs = this.config.burstLimitWindowMs || 1000;
  const burstWindowStart = now - burstWindowMs;
  let entry = this.limits.get(keyHash);
  if (!entry) {
    entry = {
      count: 0,
      timestamps: [],
      lastRequest: now,
    };
    this.limits.set(keyHash, entry);
  }
  // Remove old timestamps outside the burst window
  entry.timestamps = entry.timestamps.filter((ts) => ts > burstWindowStart);
  // Enforce burst limit
  if (entry.timestamps.length >= this.config.burstLimit) {
    return {
      allowed: false,
      remaining: 0,
      resetTime: new Date(entry.timestamps[0] + burstWindowMs),
      reason: 'Burst limit exceeded',
    };
  }
  // Add current timestamp for burst tracking
  entry.timestamps.push(now);
  // ... existing per-minute check (using `windowStart`) continues here
}
```
1. Ensure that `RateLimitConfig` includes `burstLimit` and optionally `burstLimitWindowMs` (default to 1000ms if not set).
2. Make sure that the `entry` object for each key includes a `timestamps: number[]` property for tracking request times.
3. You may need to update the cleanup logic to remove old timestamps from entries.
4. If you want to provide a more specific error message, update the `RateLimitResult` type to include a `reason` field.
</issue_to_address>
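The sliding-window burst check suggested in the comment above can be exercised in isolation. `BurstLimiter` here is a hypothetical standalone helper written for illustration, not part of the PR.

```typescript
// Standalone sliding-window burst limiter: at most `burstLimit` requests
// per key within any `burstWindowMs` window.
class BurstLimiter {
  private timestamps = new Map<string, number[]>();

  constructor(
    private burstLimit: number,
    private burstWindowMs = 1000
  ) {}

  isAllowed(keyHash: string, now = Date.now()): boolean {
    const windowStart = now - this.burstWindowMs;
    // Keep only timestamps still inside the burst window.
    const recent = (this.timestamps.get(keyHash) ?? []).filter(
      (t) => t > windowStart
    );
    if (recent.length >= this.burstLimit) {
      this.timestamps.set(keyHash, recent);
      return false; // burst limit exceeded
    }
    recent.push(now);
    this.timestamps.set(keyHash, recent);
    return true;
  }
}
```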
### Comment 2
<location> `apps/mcp-server/src/auth/LighthouseServiceFactory.ts:29` </location>
<code_context>
+ const service = new LighthouseService(apiKey);
+
+ // Initialize if the service has an initialize method
+ if (service.initialize) {
+ await service.initialize();
+ }
</code_context>
<issue_to_address>
**suggestion (bug_risk):** Service initialization is conditionally awaited, but errors are not handled.
Handle errors from LighthouseService.initialize to prevent unhandled promise rejections in this factory method.
</issue_to_address>
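One way to address the comment above, assuming the factory shape shown in the class diagram; `createServiceSafely` is an illustrative helper, not code from the PR.

```typescript
// Minimal stand-in for the service interface; initialize() is optional.
interface ILighthouseService {
  initialize?: () => Promise<void>;
}

async function createServiceSafely(
  make: () => ILighthouseService
): Promise<ILighthouseService> {
  const service = make();
  if (service.initialize) {
    try {
      await service.initialize();
    } catch (err) {
      // Surface a descriptive error instead of an unhandled rejection.
      throw new Error(
        `LighthouseService initialization failed: ${(err as Error).message}`
      );
    }
  }
  return service;
}
```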
### Comment 3
<location> `apps/mcp-server/src/auth/KeyValidationCache.ts:81` </location>
<code_context>
+ /**
+ * Get cache statistics
+ */
+ getStats(): {
+ size: number;
+ maxSize: number;
</code_context>
<issue_to_address>
**suggestion:** Cache hit rate is always zero in getStats.
The hitRate property is currently hardcoded. To accurately reflect cache performance, implement tracking for cache hits and misses.
Suggested implementation:
```typescript
private cacheHits: number = 0;
private cacheMisses: number = 0;
/**
* Get cache statistics
*/
getStats(): {
size: number;
maxSize: number;
hitRate: number;
} {
const total = this.cacheHits + this.cacheMisses;
const hitRate = total === 0 ? 0 : this.cacheHits / total;
return {
size: this.cache.size,
maxSize: this.config.maxSize,
hitRate,
};
}
```
You must increment `this.cacheHits` and `this.cacheMisses` in the cache access method (e.g., in your `get(key)` method):
- If the cache contains the key, increment `this.cacheHits`.
- If the cache does not contain the key, increment `this.cacheMisses`.
Example:
```ts
get(key: string): ValueType | undefined {
if (this.cache.has(key)) {
this.cacheHits++;
return this.cache.get(key);
} else {
this.cacheMisses++;
return undefined;
}
}
```
Make sure to add this logic wherever cache accesses occur.
</issue_to_address>
### Comment 4
<location> `apps/mcp-server/src/auth/SecureKeyHandler.ts:33-43` </location>
<code_context>
+ /**
+ * Clear sensitive data from memory
+ */
+ static clearFromMemory(obj: any, keys: string[]): void {
+ keys.forEach((key) => {
+ if (obj[key]) {
</code_context>
<issue_to_address>
**🚨 suggestion (security):** clearFromMemory may not reliably clear sensitive data from memory.
JavaScript's garbage collection may retain data even after properties are set to null or deleted. For sensitive information, consider alternative methods or clearly document this limitation.
```suggestion
/**
* Attempt to clear sensitive data from memory.
*
* WARNING: JavaScript's garbage collection may retain data in memory
* even after properties are set to null or deleted. This method does not guarantee
* immediate or reliable removal of sensitive information from memory.
* For highly sensitive data, consider using secure storage mechanisms outside of JavaScript.
*/
static clearFromMemory(obj: any, keys: string[]): void {
if (process.env.NODE_ENV === "development") {
console.warn(
"[SecureKeyHandler.clearFromMemory] WARNING: JavaScript's garbage collection may retain sensitive data in memory even after clearing. This method does not guarantee secure removal."
);
}
keys.forEach((key) => {
if (obj[key]) {
obj[key] = null;
delete obj[key];
}
});
}
```
</issue_to_address>
This PR implements the foundational authentication infrastructure to enable multi-client API key support in the Lighthouse MCP server.
New Authentication Components
1. SecureKeyHandler (`src/auth/SecureKeyHandler.ts`)
2. KeyValidationCache (`src/auth/KeyValidationCache.ts`)
3. RateLimiter (`src/auth/RateLimiter.ts`)
4. RequestContext (`src/auth/RequestContext.ts`)
5. LighthouseServiceFactory (`src/auth/LighthouseServiceFactory.ts`)
6. AuthManager (`src/auth/AuthManager.ts`)

Summary by Sourcery
Implement foundational authentication infrastructure with multi-client API key support by introducing secure key utilities, caching, rate limiting, service pooling, and request context management, and integrate authentication and performance configurations into the server and tools.