
Commit d4abe73

fix: add embedder validation to prevent misleading status indicators (RooCodeInc#4398) (RooCodeInc#5404)

* fix: add embedder validation to prevent misleading status indicators (RooCodeInc#4398)

* fix: address PR feedback and fix critical issues
  - Fixed settings-save flow to save before validation
  - Fixed Error constructor usage in scanner.ts
  - Fixed segment identification in file-watcher.ts
  - Added missing translation keys for embedder validation errors

* fix: add missing Ollama translation keys
  - Added missing ollama.title, description, and settings keys
  - Fixed translation check failure in CI/CD pipeline
  - Synchronized all 17 non-English locale files

* feat: add proactive embedder validation on provider switch
  - Validate embedder connection when switching providers
  - Prevent misleading 'Indexed' status when embedder is unavailable
  - Show immediate error feedback for invalid configurations
  - Add comprehensive test coverage for validation flow

  This ensures users get immediate feedback when configuring embedders, preventing confusion when providers like Ollama are not accessible.

* fix: improve error handling and validation in code indexing process

* refactor: extract common embedder validation and error handling logic
  - Created shared/validation-helpers.ts with centralized error handling utilities
  - Refactored OpenAI, OpenAI-Compatible, and Ollama embedders to use shared helpers
  - Eliminated duplicate error handling code across embedders
  - Improved maintainability and consistency of error handling
  - Fixed test compatibility in manager.spec.ts
  - All 2721 tests passing

* refactor: simplify validation helpers by removing unnecessary wrapper functions
  - Removed getErrorMessageForConnectionError and inlined logic into handleValidationError
  - Removed isRateLimitError, logRateLimitRetry, and logEmbeddingError wrapper functions
  - Updated openai.ts and openai-compatible.ts to inline rate limit checking and logging
  - Reduced code complexity while maintaining all functionality
  - All 311 tests continue to pass

* fix: add missing invalidResponse i18n key and fix French translation
  - Added missing 'invalidResponse' key to all locale files
  - Fixed French translation: changed 'and accessible' to 'et accessible'
  - Ensures proper error messages are displayed when embedder returns invalid responses

* fix: restore removed score settings in webviewMessageHandler
  - Restored codebaseIndexSearchMaxResults and codebaseIndexSearchMinScore settings that were unintentionally removed
  - Keep embedder validation related changes

* fix: revert unintended changes to file-watcher and scanner
  - Reverted point ID generation back to using line numbers instead of segmentHash
  - Restored { cause: deleteError } parameter in scanner error handling
  - These changes were unrelated to the embedder validation feature

---------

Co-authored-by: Daniel Riccio <[email protected]>
1 parent 7645aad commit d4abe73

34 files changed: +1798 / -116 lines
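The commit message references a new shared/validation-helpers.ts that centralizes embedder error handling, but that file is not among the hunks shown below. The following is only a minimal sketch of the idea, assuming a handleValidationError helper (a name taken from the commit message) that maps raw embedder failures to the embeddings:validation.* i18n keys added in the locale files; the shape, signature, and matching rules are illustrative guesses, not the actual implementation.

// Hypothetical sketch only - the real shared/validation-helpers.ts is not shown in this diff.
// It illustrates the kind of centralized mapping the commit message describes: turning raw
// embedder/network failures into the "embeddings:validation.*" keys added below.

export interface ValidationResult {
	valid: boolean
	error?: string // an i18n key such as "embeddings:validation.connectionFailed"
}

// Assumed helper name from the commit message ("handleValidationError"); the body is a guess.
export function handleValidationError(error: unknown): ValidationResult {
	const message = error instanceof Error ? error.message : String(error)

	// Map common failure shapes to user-facing translation keys.
	if (/401|unauthorized|invalid api key/i.test(message)) {
		return { valid: false, error: "embeddings:validation.authenticationFailed" }
	}
	if (/ECONNREFUSED|ENOTFOUND|fetch failed/i.test(message)) {
		return { valid: false, error: "embeddings:validation.connectionFailed" }
	}
	if (/model/i.test(message)) {
		return { valid: false, error: "embeddings:validation.modelNotAvailable" }
	}
	return { valid: false, error: "embeddings:validation.configurationError" }
}

Centralizing this mapping is what lets the OpenAI, OpenAI-Compatible, and Ollama embedders report consistent, translatable validation errors, per the commit message.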

src/core/webview/webviewMessageHandler.ts

Lines changed: 60 additions & 17 deletions

@@ -1843,8 +1843,12 @@ export const webviewMessageHandler = async (
 			const settings = message.codeIndexSettings

 			try {
-				// Save global state settings atomically (without codebaseIndexEnabled which is now in global settings)
+				// Check if embedder provider has changed
 				const currentConfig = getGlobalState("codebaseIndexConfig") || {}
+				const embedderProviderChanged =
+					currentConfig.codebaseIndexEmbedderProvider !== settings.codebaseIndexEmbedderProvider
+
+				// Save global state settings atomically (without codebaseIndexEnabled which is now in global settings)
 				const globalStateConfig = {
 					...currentConfig,
 					codebaseIndexQdrantUrl: settings.codebaseIndexQdrantUrl,
@@ -1880,31 +1884,70 @@
 					)
 				}

-				// Verify secrets are actually stored
-				const storedOpenAiKey = provider.contextProxy.getSecret("codeIndexOpenAiKey")
+				// Send success response first - settings are saved regardless of validation
+				await provider.postMessageToWebview({
+					type: "codeIndexSettingsSaved",
+					success: true,
+					settings: globalStateConfig,
+				})

-				// Notify code index manager of changes
+				// Update webview state
+				await provider.postStateToWebview()
+
+				// Then handle validation and initialization
 				if (provider.codeIndexManager) {
-					await provider.codeIndexManager.handleSettingsChange()
+					// If embedder provider changed, perform proactive validation
+					if (embedderProviderChanged) {
+						try {
+							// Force handleSettingsChange which will trigger validation
+							await provider.codeIndexManager.handleSettingsChange()
+						} catch (error) {
+							// Validation failed - the error state is already set by handleSettingsChange
+							provider.log(
+								`Embedder validation failed after provider change: ${error instanceof Error ? error.message : String(error)}`,
+							)
+							// Send validation error to webview
+							await provider.postMessageToWebview({
+								type: "indexingStatusUpdate",
+								values: provider.codeIndexManager.getCurrentStatus(),
+							})
+							// Exit early - don't try to start indexing with invalid configuration
+							break
+						}
+					} else {
+						// No provider change, just handle settings normally
+						try {
+							await provider.codeIndexManager.handleSettingsChange()
+						} catch (error) {
+							// Log but don't fail - settings are saved
+							provider.log(
+								`Settings change handling error: ${error instanceof Error ? error.message : String(error)}`,
+							)
+						}
+					}
+
+					// Wait a bit more to ensure everything is ready
+					await new Promise((resolve) => setTimeout(resolve, 200))

 					// Auto-start indexing if now enabled and configured
 					if (provider.codeIndexManager.isFeatureEnabled && provider.codeIndexManager.isFeatureConfigured) {
 						if (!provider.codeIndexManager.isInitialized) {
-							await provider.codeIndexManager.initialize(provider.contextProxy)
+							try {
+								await provider.codeIndexManager.initialize(provider.contextProxy)
+								provider.log(`Code index manager initialized after settings save`)
+							} catch (error) {
+								provider.log(
+									`Code index initialization failed: ${error instanceof Error ? error.message : String(error)}`,
+								)
+								// Send error status to webview
+								await provider.postMessageToWebview({
+									type: "indexingStatusUpdate",
+									values: provider.codeIndexManager.getCurrentStatus(),
+								})
+							}
 						}
-						provider.codeIndexManager.startIndexing()
 					}
 				}
-
-				// Send success response
-				await provider.postMessageToWebview({
-					type: "codeIndexSettingsSaved",
-					success: true,
-					settings: globalStateConfig,
-				})
-
-				// Update webview state
-				await provider.postStateToWebview()
 			} catch (error) {
 				provider.log(`Error saving code index settings: ${error.message || error}`)
 				await provider.postMessageToWebview({
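For context on the other side of this exchange: the handler above now always posts codeIndexSettingsSaved once the settings persist, and only afterwards posts indexingStatusUpdate if validation or initialization fails. The webview handler is not part of this diff, so the listener below is a hypothetical sketch; in particular, the shape of values (systemStatus, message) is an assumption, since the handler only forwards getCurrentStatus().

// Hypothetical sketch of the webview side of this protocol; not taken from the diff.
type ExtensionMessage =
	| { type: "codeIndexSettingsSaved"; success: boolean; settings: Record<string, unknown> }
	| { type: "indexingStatusUpdate"; values: { systemStatus: string; message?: string } } // assumed shape

window.addEventListener("message", (event: MessageEvent<ExtensionMessage>) => {
	const message = event.data
	switch (message.type) {
		case "codeIndexSettingsSaved":
			// Settings are persisted even if the embedder later fails validation,
			// so the UI can leave its "saving..." state here.
			console.log("Settings saved:", message.success)
			break
		case "indexingStatusUpdate":
			// Arrives after the save acknowledgement when validation or initialization fails,
			// replacing a previously misleading "Indexed" indicator with the error state.
			console.log("Index status:", message.values.systemStatus, message.values.message ?? "")
			break
	}
})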

src/i18n/locales/ca/embeddings.json

Lines changed: 19 additions & 1 deletion

@@ -10,7 +10,12 @@
 		"couldNotReadErrorBody": "No s'ha pogut llegir el cos de l'error",
 		"requestFailed": "La sol·licitud de l'API d'Ollama ha fallat amb l'estat {{status}} {{statusText}}: {{errorBody}}",
 		"invalidResponseStructure": "Estructura de resposta no vàlida de l'API d'Ollama: no s'ha trobat la matriu \"embeddings\" o no és una matriu.",
-		"embeddingFailed": "La incrustació d'Ollama ha fallat: {{message}}"
+		"embeddingFailed": "La incrustació d'Ollama ha fallat: {{message}}",
+		"serviceNotRunning": "El servei d'Ollama no s'està executant a {{baseUrl}}",
+		"serviceUnavailable": "El servei d'Ollama no està disponible (estat: {{status}})",
+		"modelNotFound": "No s'ha trobat el model d'Ollama: {{modelId}}",
+		"modelNotEmbeddingCapable": "El model d'Ollama no és capaç de fer incrustacions: {{modelId}}",
+		"hostNotFound": "No s'ha trobat l'amfitrió d'Ollama: {{baseUrl}}"
 	},
 	"scanner": {
 		"unknownErrorProcessingFile": "Error desconegut en processar el fitxer {{filePath}}",
@@ -19,5 +24,18 @@
 	},
 	"vectorStore": {
 		"qdrantConnectionFailed": "No s'ha pogut connectar a la base de dades vectorial Qdrant. Assegura't que Qdrant estigui funcionant i sigui accessible a {{qdrantUrl}}. Error: {{errorMessage}}"
+	},
+	"validation": {
+		"authenticationFailed": "Ha fallat l'autenticació. Comproveu la vostra clau d'API a la configuració.",
+		"connectionFailed": "No s'ha pogut connectar al servei d'incrustació. Comproveu la vostra configuració de connexió i assegureu-vos que el servei estigui funcionant.",
+		"modelNotAvailable": "El model especificat no està disponible. Comproveu la vostra configuració de model.",
+		"configurationError": "Configuració d'incrustació no vàlida. Reviseu la vostra configuració.",
+		"serviceUnavailable": "El servei d'incrustació no està disponible. Assegureu-vos que estigui funcionant i sigui accessible.",
+		"invalidEndpoint": "Punt final d'API no vàlid. Comproveu la vostra configuració d'URL.",
+		"invalidEmbedderConfig": "Configuració d'incrustació no vàlida. Comproveu la vostra configuració.",
+		"invalidApiKey": "Clau d'API no vàlida. Comproveu la vostra configuració de clau d'API.",
+		"invalidBaseUrl": "URL base no vàlida. Comproveu la vostra configuració d'URL.",
+		"invalidModel": "Model no vàlid. Comproveu la vostra configuració de model.",
+		"invalidResponse": "Resposta no vàlida del servei d'incrustació. Comproveu la vostra configuració."
 	}
 }

src/i18n/locales/de/embeddings.json

Lines changed: 19 additions & 1 deletion

@@ -10,7 +10,12 @@
 		"couldNotReadErrorBody": "Fehlerinhalt konnte nicht gelesen werden",
 		"requestFailed": "Ollama API-Anfrage fehlgeschlagen mit Status {{status}} {{statusText}}: {{errorBody}}",
 		"invalidResponseStructure": "Ungültige Antwortstruktur von Ollama API: \"embeddings\" Array nicht gefunden oder kein Array.",
-		"embeddingFailed": "Ollama Einbettung fehlgeschlagen: {{message}}"
+		"embeddingFailed": "Ollama Einbettung fehlgeschlagen: {{message}}",
+		"serviceNotRunning": "Ollama-Dienst wird unter {{baseUrl}} nicht ausgeführt",
+		"serviceUnavailable": "Ollama-Dienst ist nicht verfügbar (Status: {{status}})",
+		"modelNotFound": "Ollama-Modell nicht gefunden: {{modelId}}",
+		"modelNotEmbeddingCapable": "Ollama-Modell ist nicht für Einbettungen geeignet: {{modelId}}",
+		"hostNotFound": "Ollama-Host nicht gefunden: {{baseUrl}}"
 	},
 	"scanner": {
 		"unknownErrorProcessingFile": "Unbekannter Fehler beim Verarbeiten der Datei {{filePath}}",
@@ -19,5 +24,18 @@
 	},
 	"vectorStore": {
 		"qdrantConnectionFailed": "Verbindung zur Qdrant-Vektordatenbank fehlgeschlagen. Stelle sicher, dass Qdrant läuft und unter {{qdrantUrl}} erreichbar ist. Fehler: {{errorMessage}}"
+	},
+	"validation": {
+		"authenticationFailed": "Authentifizierung fehlgeschlagen. Bitte überprüfe deinen API-Schlüssel in den Einstellungen.",
+		"connectionFailed": "Verbindung zum Embedder-Dienst fehlgeschlagen. Bitte überprüfe deine Verbindungseinstellungen und stelle sicher, dass der Dienst läuft.",
+		"modelNotAvailable": "Das angegebene Modell ist nicht verfügbar. Bitte überprüfe deine Modellkonfiguration.",
+		"configurationError": "Ungültige Embedder-Konfiguration. Bitte überprüfe deine Einstellungen.",
+		"serviceUnavailable": "Der Embedder-Dienst ist nicht verfügbar. Bitte stelle sicher, dass er läuft und erreichbar ist.",
+		"invalidEndpoint": "Ungültiger API-Endpunkt. Bitte überprüfe deine URL-Konfiguration.",
+		"invalidEmbedderConfig": "Ungültige Embedder-Konfiguration. Bitte überprüfe deine Einstellungen.",
+		"invalidApiKey": "Ungültiger API-Schlüssel. Bitte überprüfe deine API-Schlüssel-Konfiguration.",
+		"invalidBaseUrl": "Ungültige Basis-URL. Bitte überprüfe deine URL-Konfiguration.",
+		"invalidModel": "Ungültiges Modell. Bitte überprüfe deine Modellkonfiguration.",
+		"invalidResponse": "Ungültige Antwort vom Embedder-Dienst. Bitte überprüfe deine Konfiguration."
 	}
 }

src/i18n/locales/en/embeddings.json

Lines changed: 19 additions & 1 deletion

@@ -10,7 +10,12 @@
 		"couldNotReadErrorBody": "Could not read error body",
 		"requestFailed": "Ollama API request failed with status {{status}} {{statusText}}: {{errorBody}}",
 		"invalidResponseStructure": "Invalid response structure from Ollama API: \"embeddings\" array not found or not an array.",
-		"embeddingFailed": "Ollama embedding failed: {{message}}"
+		"embeddingFailed": "Ollama embedding failed: {{message}}",
+		"serviceNotRunning": "Ollama service is not running at {{baseUrl}}",
+		"serviceUnavailable": "Ollama service is unavailable (status: {{status}})",
+		"modelNotFound": "Ollama model not found: {{modelId}}",
+		"modelNotEmbeddingCapable": "Ollama model is not embedding capable: {{modelId}}",
+		"hostNotFound": "Ollama host not found: {{baseUrl}}"
 	},
 	"scanner": {
 		"unknownErrorProcessingFile": "Unknown error processing file {{filePath}}",
@@ -19,5 +24,18 @@
 	},
 	"vectorStore": {
 		"qdrantConnectionFailed": "Failed to connect to Qdrant vector database. Please ensure Qdrant is running and accessible at {{qdrantUrl}}. Error: {{errorMessage}}"
+	},
+	"validation": {
+		"authenticationFailed": "Authentication failed. Please check your API key in the settings.",
+		"connectionFailed": "Failed to connect to the embedder service. Please check your connection settings and ensure the service is running.",
+		"modelNotAvailable": "The specified model is not available. Please check your model configuration.",
+		"configurationError": "Invalid embedder configuration. Please review your settings.",
+		"serviceUnavailable": "The embedder service is not available. Please ensure it is running and accessible.",
+		"invalidEndpoint": "Invalid API endpoint. Please check your URL configuration.",
+		"invalidEmbedderConfig": "Invalid embedder configuration. Please check your settings.",
+		"invalidApiKey": "Invalid API key. Please check your API key configuration.",
+		"invalidBaseUrl": "Invalid base URL. Please check your URL configuration.",
+		"invalidModel": "Invalid model. Please check your model configuration.",
+		"invalidResponse": "Invalid response from embedder service. Please check your configuration."
 	}
 }
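The locale hunks only add keys; how they surface to the user is outside this diff. Below is a small standalone i18next sketch (an assumed i18n stack, not the extension's actual wiring) showing one of the English validation keys added above resolving to its message; only two keys are copied in for brevity.

// Standalone illustration only - assumes plain i18next, not the extension's own i18n setup.
import i18next from "i18next"

// Keys copied from the en/embeddings.json hunk above; trimmed to two entries for brevity.
const enEmbeddings = {
	validation: {
		connectionFailed:
			"Failed to connect to the embedder service. Please check your connection settings and ensure the service is running.",
		authenticationFailed: "Authentication failed. Please check your API key in the settings.",
	},
}

i18next
	.init({
		lng: "en",
		ns: ["embeddings"],
		defaultNS: "embeddings",
		resources: { en: { embeddings: enEmbeddings } },
	})
	.then(() => {
		// A validation failure stored as an i18n key becomes readable text at display time.
		console.log(i18next.t("embeddings:validation.connectionFailed"))
	})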

src/i18n/locales/es/embeddings.json

Lines changed: 19 additions & 1 deletion

@@ -10,7 +10,12 @@
 		"couldNotReadErrorBody": "No se pudo leer el cuerpo del error",
 		"requestFailed": "La solicitud de la API de Ollama falló con estado {{status}} {{statusText}}: {{errorBody}}",
 		"invalidResponseStructure": "Estructura de respuesta inválida de la API de Ollama: array \"embeddings\" no encontrado o no es un array.",
-		"embeddingFailed": "Incrustación de Ollama falló: {{message}}"
+		"embeddingFailed": "Incrustación de Ollama falló: {{message}}",
+		"serviceNotRunning": "El servicio Ollama no se está ejecutando en {{baseUrl}}",
+		"serviceUnavailable": "El servicio Ollama no está disponible (estado: {{status}})",
+		"modelNotFound": "No se encuentra el modelo Ollama: {{modelId}}",
+		"modelNotEmbeddingCapable": "El modelo Ollama no es capaz de realizar incrustaciones: {{modelId}}",
+		"hostNotFound": "No se encuentra el host de Ollama: {{baseUrl}}"
 	},
 	"scanner": {
 		"unknownErrorProcessingFile": "Error desconocido procesando archivo {{filePath}}",
@@ -19,5 +24,18 @@
 	},
 	"vectorStore": {
 		"qdrantConnectionFailed": "Error al conectar con la base de datos vectorial Qdrant. Asegúrate de que Qdrant esté funcionando y sea accesible en {{qdrantUrl}}. Error: {{errorMessage}}"
+	},
+	"validation": {
+		"authenticationFailed": "Error de autenticación. Comprueba tu clave de API en los ajustes.",
+		"connectionFailed": "Error al conectar con el servicio de embedder. Comprueba los ajustes de conexión y asegúrate de que el servicio esté funcionando.",
+		"modelNotAvailable": "El modelo especificado no está disponible. Comprueba la configuración de tu modelo.",
+		"configurationError": "Configuración de embedder no válida. Revisa tus ajustes.",
+		"serviceUnavailable": "El servicio de embedder no está disponible. Asegúrate de que esté funcionando y sea accesible.",
+		"invalidEndpoint": "Punto de conexión de API no válido. Comprueba la configuración de tu URL.",
+		"invalidEmbedderConfig": "Configuración de embedder no válida. Comprueba tus ajustes.",
+		"invalidApiKey": "Clave de API no válida. Comprueba la configuración de tu clave de API.",
+		"invalidBaseUrl": "URL base no válida. Comprueba la configuración de tu URL.",
+		"invalidModel": "Modelo no válido. Comprueba la configuración de tu modelo.",
+		"invalidResponse": "Respuesta no válida del servicio de embedder. Comprueba tu configuración."
 	}
 }

src/i18n/locales/fr/embeddings.json

Lines changed: 19 additions & 1 deletion

@@ -10,7 +10,12 @@
 		"couldNotReadErrorBody": "Impossible de lire le corps de l'erreur",
 		"requestFailed": "Échec de la requête API Ollama avec le statut {{status}} {{statusText}} : {{errorBody}}",
 		"invalidResponseStructure": "Structure de réponse invalide de l'API Ollama : tableau \"embeddings\" non trouvé ou n'est pas un tableau.",
-		"embeddingFailed": "Échec de l'embedding Ollama : {{message}}"
+		"embeddingFailed": "Échec de l'embedding Ollama : {{message}}",
+		"serviceNotRunning": "Le service Ollama n'est pas en cours d'exécution sur {{baseUrl}}",
+		"serviceUnavailable": "Le service Ollama est indisponible (statut : {{status}})",
+		"modelNotFound": "Modèle Ollama introuvable : {{modelId}}",
+		"modelNotEmbeddingCapable": "Le modèle Ollama n'est pas capable d'intégrer : {{modelId}}",
+		"hostNotFound": "Hôte Ollama introuvable : {{baseUrl}}"
 	},
 	"scanner": {
 		"unknownErrorProcessingFile": "Erreur inconnue lors du traitement du fichier {{filePath}}",
@@ -19,5 +24,18 @@
 	},
 	"vectorStore": {
 		"qdrantConnectionFailed": "Échec de la connexion à la base de données vectorielle Qdrant. Veuillez vous assurer que Qdrant fonctionne et est accessible à {{qdrantUrl}}. Erreur : {{errorMessage}}"
+	},
+	"validation": {
+		"authenticationFailed": "Échec de l'authentification. Veuillez vérifier votre clé API dans les paramètres.",
+		"connectionFailed": "Échec de la connexion au service d'embedding. Veuillez vérifier vos paramètres de connexion et vous assurer que le service est en cours d'exécution.",
+		"modelNotAvailable": "Le modèle spécifié n'est pas disponible. Veuillez vérifier la configuration de votre modèle.",
+		"configurationError": "Configuration de l'embedder invalide. Veuillez vérifier vos paramètres.",
+		"serviceUnavailable": "Le service d'embedding n'est pas disponible. Veuillez vous assurer qu'il est en cours d'exécution et accessible.",
+		"invalidEndpoint": "Point de terminaison d'API invalide. Veuillez vérifier votre configuration d'URL.",
+		"invalidEmbedderConfig": "Configuration de l'embedder invalide. Veuillez vérifier vos paramètres.",
+		"invalidApiKey": "Clé API invalide. Veuillez vérifier votre configuration de clé API.",
+		"invalidBaseUrl": "URL de base invalide. Veuillez vérifier votre configuration d'URL.",
+		"invalidModel": "Modèle invalide. Veuillez vérifier votre configuration de modèle.",
+		"invalidResponse": "Réponse invalide du service d'embedder. Veuillez vérifier votre configuration."
 	}
 }

src/i18n/locales/hi/embeddings.json

Lines changed: 19 additions & 1 deletion

@@ -10,7 +10,12 @@
 		"couldNotReadErrorBody": "त्रुटि सामग्री पढ़ नहीं सका",
 		"requestFailed": "Ollama API अनुरोध स्थिति {{status}} {{statusText}} के साथ विफल: {{errorBody}}",
 		"invalidResponseStructure": "Ollama API से अमान्य प्रतिक्रिया संरचना: \"embeddings\" सरणी नहीं मिली या सरणी नहीं है।",
-		"embeddingFailed": "Ollama एम्बेडिंग विफल: {{message}}"
+		"embeddingFailed": "Ollama एम्बेडिंग विफल: {{message}}",
+		"serviceNotRunning": "ओलामा सेवा {{baseUrl}} पर नहीं चल रही है",
+		"serviceUnavailable": "ओलामा सेवा अनुपलब्ध है (स्थिति: {{status}})",
+		"modelNotFound": "ओलामा मॉडल नहीं मिला: {{modelId}}",
+		"modelNotEmbeddingCapable": "ओलामा मॉडल एम्बेडिंग में सक्षम नहीं है: {{modelId}}",
+		"hostNotFound": "ओलामा होस्ट नहीं मिला: {{baseUrl}}"
 	},
 	"scanner": {
 		"unknownErrorProcessingFile": "फ़ाइल {{filePath}} प्रसंस्करण में अज्ञात त्रुटि",
@@ -19,5 +24,18 @@
 	},
 	"vectorStore": {
 		"qdrantConnectionFailed": "Qdrant वेक्टर डेटाबेस से कनेक्ट करने में विफल। कृपया सुनिश्चित करें कि Qdrant चल रहा है और {{qdrantUrl}} पर पहुंच योग्य है। त्रुटि: {{errorMessage}}"
+	},
+	"validation": {
+		"authenticationFailed": "प्रमाणीकरण विफल। कृपया सेटिंग्स में अपनी एपीआई कुंजी जांचें।",
+		"connectionFailed": "एम्बेडर सेवा से कनेक्ट करने में विफल। कृपया अपनी कनेक्शन सेटिंग्स जांचें और सुनिश्चित करें कि सेवा चल रही है।",
+		"modelNotAvailable": "निर्दिष्ट मॉडल उपलब्ध नहीं है। कृपया अपनी मॉडल कॉन्फ़िगरेशन जांचें।",
+		"configurationError": "अमान्य एम्बेडर कॉन्फ़िगरेशन। कृपया अपनी सेटिंग्स की समीक्षा करें।",
+		"serviceUnavailable": "एम्बेडर सेवा उपलब्ध नहीं है। कृपया सुनिश्चित करें कि यह चल रहा है और पहुंच योग्य है।",
+		"invalidEndpoint": "अमान्य एपीआई एंडपॉइंट। कृपया अपनी यूआरएल कॉन्फ़िगरेशन जांचें।",
+		"invalidEmbedderConfig": "अमान्य एम्बेडर कॉन्फ़िगरेशन। कृपया अपनी सेटिंग्स जांचें।",
+		"invalidApiKey": "अमान्य एपीआई कुंजी। कृपया अपनी एपीआई कुंजी कॉन्फ़िगरेशन जांचें।",
+		"invalidBaseUrl": "अमान्य बेस यूआरएल। कृपया अपनी यूआरएल कॉन्फ़िगरेशन जांचें।",
+		"invalidModel": "अमान्य मॉडल। कृपया अपनी मॉडल कॉन्फ़िगरेशन जांचें।",
+		"invalidResponse": "एम्बेडर सेवा से अमान्य प्रतिक्रिया। कृपया अपनी कॉन्फ़िगरेशन जांचें।"
 	}
 }
