A TorchServe-based medical imaging model for automated breast density classification from mammogram images using deep learning. This model classifies breast density into the standard BI-RADS categories (A, B, C, D) and generates comprehensive medical reports.
This project implements a production-ready breast density classification system that:
- Uses transfer learning with Inception v3 architecture
- Supports multiple mammography acquisition protocols (FFDM, C-View, Tomosynthesis)
- Generates AI-powered medical reports via LLM integration
- Deploys as a TorchServe model for scalable inference
- Integrates with cloud storage (GCS) for data management
```mermaid
graph TB
    subgraph "Input Layer"
        A[Mammogram Image] --> B[Input Validation]
        B --> C[Image Preprocessing]
    end
    subgraph "Inference Pipeline"
        C --> D[MONAI Dataset Creation]
        D --> E[Data Augmentation]
        E --> F[Inception v3 Model]
        F --> G[Softmax Classification]
    end
    subgraph "Report Generation"
        G --> H[Output Processing]
        H --> I[LLM Client]
        I --> J[Med42 Report Generation]
        J --> K[PDF Conversion]
    end
    subgraph "Output Layer"
        K --> L[GCS Upload]
        L --> M[Response JSON]
    end
    subgraph "Configuration"
        N[Model Config] --> F
        O[TorchServe Config] --> P[Handler]
        P --> F
    end
```
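The preprocessing and dataset-creation stages in the diagram are built on MONAI. Below is a minimal sketch of what that pipeline could look like; the specific transforms and parameters are illustrative assumptions, not the exact code in `inference_handler.py`.

```python
# Illustrative preprocessing sketch (assumed transforms, not the repository's exact pipeline)
from monai.data import Dataset, DataLoader
from monai.transforms import (
    Compose,
    LoadImaged,
    EnsureChannelFirstd,
    ScaleIntensityd,
    Resized,
    EnsureTyped,
)

def build_inference_loader(image_paths):
    """Wrap downloaded mammogram files in a MONAI Dataset for inference."""
    transforms = Compose([
        LoadImaged(keys="image"),                           # read JPEG from disk
        EnsureChannelFirstd(keys="image"),                  # HWC -> CHW
        ScaleIntensityd(keys="image", minv=0.0, maxv=1.0),  # normalize to [0, 1]
        Resized(keys="image", spatial_size=(299, 299)),     # Inception v3 input size
        EnsureTyped(keys="image"),                          # convert to torch.Tensor
    ])
    data = [{"image": path} for path in image_paths]
    dataset = Dataset(data=data, transform=transforms)
    return DataLoader(dataset, batch_size=1, num_workers=0)
```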
```mermaid
graph LR
    subgraph "Transfer Learning Pipeline"
        A[ImageNet Pre-trained<br/>Inception v3] --> B[Feature Extraction<br/>Layers]
        B --> C[Global Average<br/>Pooling]
        C --> D[Fully Connected<br/>Layer 1024]
        D --> E[Dropout<br/>Layer]
        E --> F[Output Layer<br/>4 classes]
        F --> G[Softmax<br/>Activation]
    end
    subgraph "Input Processing"
        H[Raw Mammogram<br/>299x299x3] --> I[Normalization<br/>0.0-1.0]
        I --> J[Channel First<br/>Format]
        J --> A
    end
    subgraph "Output Classes"
        G --> K[Class A: Almost Entirely Fat]
        G --> L[Class B: Scattered Fibroglandular]
        G --> M[Class C: Heterogeneously Dense]
        G --> N[Class D: Extremely Dense]
    end
```
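The transfer-learning head in the diagram (Inception v3 backbone, 1024-unit fully connected layer, dropout, 4-way softmax) can be expressed roughly as follows. This is a sketch reconstructed from the diagram, not the repository's exact layer definitions.

```python
# Sketch of the classification head described above (layer sizes assumed from the diagram)
import torch
import torch.nn as nn
from torchvision.models import inception_v3, Inception_V3_Weights

def build_density_classifier(num_classes: int = 4, dropout: float = 0.5) -> nn.Module:
    """Inception v3 backbone with a 1024-unit FC head and a 4-class output layer."""
    backbone = inception_v3(weights=Inception_V3_Weights.IMAGENET1K_V1)
    in_features = backbone.fc.in_features  # 2048 features after global average pooling
    backbone.fc = nn.Sequential(
        nn.Linear(in_features, 1024),
        nn.ReLU(inplace=True),
        nn.Dropout(p=dropout),
        nn.Linear(1024, num_classes),
    )
    return backbone

model = build_density_classifier()
model.eval()
with torch.no_grad():
    logits = model(torch.rand(1, 3, 299, 299))  # channel-first, [0, 1]-normalized input
    probs = torch.softmax(logits, dim=1)        # BI-RADS A/B/C/D probabilities
```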
```mermaid
sequenceDiagram
    participant Client
    participant TorchServe
    participant Handler
    participant Model
    participant LLM
    participant GCS
    Client->>TorchServe: POST /predictions/monai-breast-density-classification
    TorchServe->>Handler: preprocess(data)
    Handler->>Handler: validate_input()
    Handler->>Handler: create_dataset()
    Handler->>Model: inference()
    Model->>Handler: classification_output
    Handler->>LLM: generate_report()
    LLM->>Handler: medical_report.pdf
    Handler->>GCS: upload_files()
    GCS->>Handler: gs://bucket/path
    Handler->>TorchServe: response_json
    TorchServe->>Client: inference_result
```
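The sequence above follows the standard TorchServe handler lifecycle (`preprocess` -> `inference` -> `postprocess`). A simplified skeleton of how such a handler might be organized is sketched below; the helper methods are placeholders inferred from the diagram, not the repository's actual implementation.

```python
# Simplified handler skeleton following the TorchServe BaseHandler lifecycle.
# The GCS, dataset, and report helpers are placeholders inferred from the diagram.
import json
import torch
from ts.torch_handler.base_handler import BaseHandler

class BreastDensityClassificationHandler(BaseHandler):
    def preprocess(self, data):
        # TorchServe passes a list of request dicts; the payload sits under "body" or "data"
        request = data[0].get("body") or data[0].get("data")
        if isinstance(request, (bytes, bytearray)):
            request = json.loads(request)
        self.request = request                 # keep bucket/paths for postprocess
        return self._create_dataset(request)   # download inputs, build MONAI dataset

    def inference(self, batch, *args, **kwargs):
        with torch.no_grad():
            logits = self.model(batch)
        return torch.softmax(logits, dim=1)

    def postprocess(self, probabilities):
        report_uri = self._generate_and_upload_report(probabilities, self.request)
        return [{"id": self.request["id"],
                 "model_name": "BreastDensityClassification",
                 "outputs": [{"name": "output_path", "shape": [1],
                              "datatype": "BYTES", "data": [report_uri]}]}]

    # --- placeholders for the steps shown in the diagram ---------------------
    def _create_dataset(self, request):
        raise NotImplementedError("download inputs from GCS and build a MONAI dataset")

    def _generate_and_upload_report(self, probabilities, request):
        raise NotImplementedError("call the LLM, render the PDF, upload to GCS")
```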
```
fetch/
├── config/                       # Configuration files
│   ├── model-definition.json     # Model metadata and workflow
│   ├── model-config.yaml         # TorchServe model configuration
│   ├── config.properties         # TorchServe server settings
│   ├── archiver.properties       # MAR packaging configuration
│   └── long_description.md       # Model documentation
├── inference_monai/              # Core inference package
│   ├── inference_handler.py      # Main TorchServe handler
│   ├── config.py                 # Environment configuration
│   ├── inference_types.py        # Input/Output type definitions
│   ├── inference_utils.py        # Utility functions
│   └── report_generator/         # Medical report generation
│       ├── report_generator.py   # Main report generation logic
│       ├── prompts.py            # LLM prompt templates
│       └── llm_interface/        # LLM client implementations
│           ├── llm_client.py     # Main LLM client
│           └── clients/          # Provider-specific clients
│               ├── openai.py     # OpenAI client
│               └── perplexity.py # Perplexity client
├── weights/                      # Model weights
│   ├── get_weights.sh            # Weight download script
│   └── model.pt                  # Trained model weights
├── examples/                     # Sample data and configurations
├── resources/                    # Additional resources
│   └── paper.md                  # Research paper reference
├── utils/                        # Utility modules
└── setup.py                      # Package installation
```
- Python 3.8+
- CUDA-compatible GPU (recommended)
- TorchServe
- Required API keys (OpenAI/Perplexity for report generation)
- Clone and install dependencies:
```bash
git clone <repository-url>
cd fetch
pip install -e .
```
- Download model weights:
```bash
cd weights
chmod +x get_weights.sh
./get_weights.sh
```
- Set up environment variables:
```bash
export OPENAI_API_KEY="your-openai-key"
# or
export PERPLEXITY_API_KEY="your-perplexity-key"
```
- Create MAR file:
```bash
torch-model-archiver \
  --model-name monai-breast-density-classification \
  --version 1.0 \
  --handler inference_monai/inference_handler.py \
  --config-file config/model-config.yaml \
  --serialized-file weights/model.pt \
  --export-path model-store \
  --extra-files config/,inference_monai/,resources/
```
- Start TorchServe:
```bash
torchserve --start \
  --model-store model-store \
  --models monai-breast-density-classification.mar \
  --ts-config config/config.properties
```
- Make inference requests:
```bash
curl -X POST http://localhost:8085/predictions/monai-breast-density-classification \
  -H "Content-Type: application/json" \
  -d '{
    "id": "test-001",
    "bucket": "your-bucket",
    "root_path": "test-session",
    "input_folder": "input",
    "input_files": ["mammogram.jpg"],
    "output_folder": "output"
  }'
```
The model expects JSON input with the following structure:
```json
{
  "id": "unique-session-id",
  "bucket": "gcs-bucket-name",
  "root_path": "session-directory",
  "input_folder": "input-subfolder",
  "input_files": ["mammogram.jpg"],
  "output_folder": "output-subfolder"
}
```
The model returns a JSON response containing:
```json
{
  "id": "unique-session-id",
  "model_name": "BreastDensityClassification",
  "outputs": [
    {
      "name": "output_path",
      "shape": [1],
      "datatype": "BYTES",
      "data": ["gs://bucket/path/to/report.pdf"]
    }
  ]
}
```
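Equivalently, the request can be issued from Python. The snippet below is a minimal client sketch using the placeholder host, port, and bucket values from the examples above.

```python
# Minimal Python client sketch using the placeholder values shown above
import requests

payload = {
    "id": "test-001",
    "bucket": "your-bucket",
    "root_path": "test-session",
    "input_folder": "input",
    "input_files": ["mammogram.jpg"],
    "output_folder": "output",
}

url = "http://localhost:8085/predictions/monai-breast-density-classification"
response = requests.post(url, json=payload, timeout=120)
response.raise_for_status()

result = response.json()
# Extract the GCS path of the generated PDF report from the first output
report_path = result["outputs"][0]["data"][0]
print(f"Report uploaded to: {report_path}")
```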
- Source: Mayo Clinic PACS (June-September 2021)
- Total Images: 151,164 mammograms
- Patients: 26,411
- Acquisition Types: FFDM, C-View, Tomosynthesis Projection, Intelligent 2D
- Architecture: Inception v3 with transfer learning
- Input Size: 299×299×3 (RGB)
- Classes: 4 (BI-RADS categories A, B, C, D)
- Training: Multi-protocol dataset for improved generalization
- A: Almost entirely fat
- B: Scattered fibroglandular density
- C: Heterogeneously dense
- D: Extremely dense
The system generates comprehensive medical reports using:
- Model Output Processing: Converts raw classification scores to interpretable results
- LLM Integration: Uses OpenAI or Perplexity APIs for report generation
- Medical Context: Incorporates research literature and clinical guidelines
- PDF Generation: Converts markdown reports to professional PDF format
Each generated report includes:
- Model Details and Technical Information
- Key Findings and Observations
- Clinical Significance and Implications
- Recommendations for Follow-up
- Limitations and Considerations
- References to Medical Literature
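As an illustration of how the report-generation stage could be wired together (the actual `report_generator.py` may differ; the OpenAI model name and the markdown-to-PDF library used here are assumptions):

```python
# Hedged sketch of the report-generation stage; library choices and model name are assumptions.
import markdown                  # markdown -> HTML
from weasyprint import HTML      # HTML -> PDF (the repository may use a different converter)
from openai import OpenAI

def generate_report_pdf(density_class: str, probabilities: dict, out_path: str = "report.pdf") -> str:
    prompt = (
        "Write a structured breast density report for a mammogram classified as "
        f"BI-RADS density category {density_class} (class probabilities: {probabilities}). "
        "Include key findings, clinical significance, follow-up recommendations, "
        "limitations, and references to relevant literature."
    )
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    completion = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    report_md = completion.choices[0].message.content
    HTML(string=markdown.markdown(report_md)).write_pdf(out_path)
    return out_path
```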
- Workers: 1 (configurable in `model-config.yaml`)
- Batch Size: 16
- Timeout: 60 seconds
- Memory: 32GB recommended
- GPU: NVIDIA L4 or equivalent
```bash
LOCAL_EXECUTION=false        # Set to true for local testing
OPENAI_API_KEY=your-key      # For OpenAI-based report generation
PERPLEXITY_API_KEY=your-key  # For Perplexity-based report generation
```
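These variables are consumed at startup by `config.py`; a minimal sketch of how they might be read (the validation step is an assumption for illustration):

```python
# Minimal sketch of reading the environment variables listed above
import os

LOCAL_EXECUTION = os.environ.get("LOCAL_EXECUTION", "false").lower() == "true"
OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY")
PERPLEXITY_API_KEY = os.environ.get("PERPLEXITY_API_KEY")

if not (OPENAI_API_KEY or PERPLEXITY_API_KEY):
    raise RuntimeError("Report generation requires OPENAI_API_KEY or PERPLEXITY_API_KEY")
```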
```bash
# Install development dependencies
pip install -e ".[dev]"

# Run tests (if available)
python -m pytest tests/
```
```bash
# Set local execution mode
export LOCAL_EXECUTION=true

# Test inference handler directly
python -c "
from inference_monai.inference_handler import BreastDensityClassificationHandler
handler = BreastDensityClassificationHandler()
# Add your test code here
"
```
TorchServe exposes its built-in APIs, including Prometheus metrics, at:
- Inference API: http://localhost:8085
- Management API: http://localhost:8086
- Metrics API: http://localhost:8082 (Prometheus format)
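For example, the registered models and the current metrics can be checked against the standard TorchServe endpoints (ports as configured above):

```python
# Quick check of the TorchServe management and metrics endpoints configured above
import requests

models = requests.get("http://localhost:8086/models").json()   # management API: list models
metrics = requests.get("http://localhost:8082/metrics").text   # Prometheus text format
print(models)
print(metrics[:500])
```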
- Single Image Processing: Currently processes one mammogram at a time
- Image Format: Requires JPEG format mammograms
- Size Requirements: Images are resized to 299×299 pixels
- Internet Connection: Report generation requires API access to LLM services
- Medical Disclaimer: Outputs are for research purposes and require clinical validation
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Submit a pull request
This project is licensed under the MIT License - see the LICENSE file for details.
- Gupta, V., et al. "A Multi Reconstruction Study of Breast Density Estimation Using Deep Learning." arXiv:2202.08238 (2022)
- MONAI Model Zoo: https://monai.io/model-zoo.html
- BI-RADS Atlas: https://www.acr.org/Clinical-Resources/Reporting-and-Data-Systems/Bi-Rads
If you use this model in your research, please cite:
```bibtex
@article{gupta2022multi,
  title={A Multi Reconstruction Study of Breast Density Estimation Using Deep Learning},
  author={Gupta, Vikash and Demirer, Mutlu and Maxwell, Robert W and White, Richard D and Erdal, Barbaros S},
  journal={arXiv preprint arXiv:2202.08238},
  year={2022}
}
```