Labels: core-logic (Business logic implementation), depends-on-redis (Requires Redis client), medium-priority (Standard priority), mvp (Phase 1 MVP scope)
Description
User Story
As the core processing layer, I need an analysis service to orchestrate the analysis workflow, integrate with LLM providers, and generate structured hypotheses.
Rationale
Core component for LLM-driven analysis pipeline.
Acceptance Criteria
- Create `aira/core/analysis_service.py` with `AnalysisService` class (see the sketch after this list)
- Define prompt templates for incident and data context
- Call OpenAI provider with structured prompts
- Parse LLM responses into `AnalysisResult` model
- Cache results in Redis (`analysis_cache_hits`, TTL 1 hour)
- Confidence scoring based on data completeness and response quality
- Metrics: `analysis_requests_total`, `analysis_duration_seconds`, `llm_api_calls_total`, `llm_response_time_seconds`
- Error handling with circuit breaker, fallback to cache
- Unit & integration tests
- Honor `LLM_TIMEOUT` from configuration
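
These criteria describe a fairly standard cache-then-call flow. The following is a minimal sketch of how the pieces could fit together, assuming `redis-py`, the `openai` v1 client, `prometheus_client`, and Pydantic; the `AnalysisResult` fields, prompt text, Redis key scheme, default model name, and env-var lookup for `LLM_TIMEOUT` are illustrative assumptions, and the circuit breaker is reduced to a plain exception handler that falls back to the cache.

```python
import os

import redis
from openai import OpenAI
from prometheus_client import Counter, Histogram
from pydantic import BaseModel

# Metric names taken from the acceptance criteria above.
ANALYSIS_REQUESTS = Counter("analysis_requests_total", "Analysis requests received")
ANALYSIS_DURATION = Histogram("analysis_duration_seconds", "End-to-end analysis latency")
LLM_API_CALLS = Counter("llm_api_calls_total", "Calls made to the LLM provider")
LLM_RESPONSE_TIME = Histogram("llm_response_time_seconds", "LLM provider response latency")
CACHE_HITS = Counter("analysis_cache_hits", "Analysis results served from the Redis cache")


class AnalysisResult(BaseModel):
    # Illustrative fields only; the real model is defined elsewhere in the project.
    hypotheses: list[str]
    confidence: float


class AnalysisService:
    """Orchestrates one analysis request: cache lookup, prompted LLM call, parse, cache write."""

    CACHE_TTL_SECONDS = 3600  # "TTL 1 hour" from the acceptance criteria
    # Hypothetical prompt template combining incident and data context.
    PROMPT_TEMPLATE = (
        "Incident:\n{incident}\n\nData context:\n{data_context}\n\n"
        "Return JSON with fields 'hypotheses' (list of strings) and 'confidence' (0-1)."
    )

    def __init__(self, redis_client: redis.Redis, llm_client: OpenAI, model: str = "gpt-4o-mini"):
        self._redis = redis_client
        self._llm = llm_client
        self._model = model
        # Honor LLM_TIMEOUT; an env var stands in for the real configuration object here.
        self._timeout = float(os.getenv("LLM_TIMEOUT", "30"))

    def analyze(self, incident_id: str, incident: str, data_context: str) -> AnalysisResult:
        ANALYSIS_REQUESTS.inc()
        with ANALYSIS_DURATION.time():
            cache_key = f"analysis:{incident_id}"  # hypothetical key scheme
            cached = self._redis.get(cache_key)
            if cached is not None:
                CACHE_HITS.inc()
                return AnalysisResult.model_validate_json(cached)

            try:
                LLM_API_CALLS.inc()
                with LLM_RESPONSE_TIME.time():
                    response = self._llm.chat.completions.create(
                        model=self._model,
                        messages=[{
                            "role": "user",
                            "content": self.PROMPT_TEMPLATE.format(
                                incident=incident, data_context=data_context),
                        }],
                        timeout=self._timeout,
                    )
                result = AnalysisResult.model_validate_json(response.choices[0].message.content)
                self._redis.setex(cache_key, self.CACHE_TTL_SECONDS, result.model_dump_json())
                return result
            except Exception:
                # Simplified stand-in for the circuit breaker: on provider failure,
                # serve the last cached result if one exists, otherwise re-raise.
                stale = self._redis.get(cache_key)
                if stale is not None:
                    return AnalysisResult.model_validate_json(stale)
                raise
```

Declaring the metrics at module level keeps them registered once per process. The confidence-scoring step (weighing data completeness and response quality) is left out of this sketch and would sit between parsing and caching.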
Metadata
- Status: MVP
- Category: Core Workflow
- Week: Week 5
- Complexity: Medium
- Critical Path: No
- Dependencies: AIRA-21
- Original Ticket: #53
Phase 1 MVP Tracking Issue