This guide covers how to build and deploy the credit risk model with Feature Store integration on JFrogML. The training happens during the build process via the build() method, and the predict() method handles serving with real-time Feature Store access.
Prerequisites: Complete Feature Store Setup & Testing first.
✅ Feature Store Setup Complete: You should have already completed the Feature Store Setup & Testing guide, which includes:
- JFrogML CLI installation and configuration
- Data source and feature set registration
- Feature Store validation
If you haven't completed Phase 1, please do that first before proceeding.
Understanding the required project structure for JFrogML deployment:
```
.
├── main/              # Main directory containing core code
│   ├── __init__.py    # Python package marker (required)
│   ├── model.py       # FrogMLModel implementation with build() and predict()
│   ├── utils.py       # Data preprocessing utilities
│   └── conda.yaml     # Conda environment dependencies
```
- `main/`: Directory containing all core model code and dependencies
- `__init__.py`: Empty file that makes `main/` a Python package for imports
- `model.py`: FrogMLModel class with key methods:
  - `build()`: Training logic (runs during the build process)
  - `initialize_model()`: Runtime initialization at deployment
  - `predict()`: Inference logic (runs during serving) with Feature Store integration
  - `schema()`: Defines which features to pull from the Feature Store for enrichment
- `utils.py`: Data preprocessing and cleaning utilities
- `conda.yaml`: Environment dependencies (Python version, packages, etc.)
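As a reference point, a `conda.yaml` for this project might look like the sketch below. Package names and versions are illustrative, not prescriptive; pin whatever your model code actually imports, and install the JFrogML SDK package per the official docs:

```yaml
name: credit_risk_model
channels:
  - conda-forge
dependencies:
  - python=3.9
  - pip
  - pip:
      - frogml          # JFrogML SDK (package name assumed; see JFrogML docs)
      - pandas
      - scikit-learn
```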
- Build Process: JFrogML reads `conda.yaml` → creates the environment → imports from `main/model.py` → runs the `build()` method → packages everything
- Deployment: Uses the trained model and the `predict()` method for serving with Feature Store access
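To make that flow concrete, here is a minimal, dependency-free sketch of the shape of `main/model.py`. In a real project the class subclasses the SDK's `FrogMLModel` and uses its feature-store helpers; here the base class is stubbed, and the feature names and "training" logic are placeholders so the sketch runs standalone:

```python
class FrogMLModel:
    """Stand-in for the SDK base class so this sketch is self-contained."""
    pass


class CreditRiskModel(FrogMLModel):
    def build(self):
        # Runs once during `frogml models build`: fetch offline features,
        # train, and keep the fitted artifact on self. Placeholder here.
        self.threshold = 10_000

    def initialize_model(self):
        # Runs once per serving replica at deployment (e.g. load artifacts).
        pass

    def schema(self):
        # Declares which Feature Store features enrich each prediction
        # request. Feature-set and feature names below are illustrative.
        return ["user_credit_risk_features.credit_amount",
                "user_credit_risk_features.duration"]

    def predict(self, rows):
        # Runs per request during serving; `rows` stands in for the enriched
        # request payload (a DataFrame in the real SDK).
        return [{"risk": "high" if r["credit_amount"] > self.threshold
                 else "low"} for r in rows]


model = CreditRiskModel()
model.build()
print(model.predict([{"credit_amount": 15_000}]))  # [{'risk': 'high'}]
```

The key design point is that `build()` and `predict()` are separated by the packaging step: anything `predict()` needs at serving time must either be stored on the instance during `build()` or loaded in `initialize_model()`.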
Before building on JFrogML, validate your code locally for faster feedback:
```shell
# Test your model locally using JFrogML's run_local utility
python test_model_locally.py
```

This uses JFrogML's `run_local` SDK utility to:

- Validate your `FrogMLModel` implementation
- Test `build()` and `predict()` methods locally with the Feature Store
- Catch issues before triggering remote builds
- Provide faster development iteration
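Conceptually, this kind of local validation is just a smoke test: instantiate the model, call `build()`, then call `predict()` on a sample payload and sanity-check the output. A dependency-free sketch of that flow (the toy model and its fields are illustrative, not the quickstart's real class):

```python
def smoke_test(model_cls, sample_rows):
    """Mimic the local-validation flow: build once, then predict on samples."""
    model = model_cls()
    model.build()                       # training logic must complete cleanly
    preds = model.predict(sample_rows)  # serving logic must accept sample input
    assert len(preds) == len(sample_rows), "expect one prediction per row"
    return preds


# Toy model standing in for a FrogMLModel subclass:
class ToyModel:
    def build(self):
        self.bias = 1

    def predict(self, rows):
        return [r["x"] + self.bias for r in rows]


print(smoke_test(ToyModel, [{"x": 1}, {"x": 2}]))  # [2, 3]
```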
Before building, create your model in the JFrog platform:
- Navigate to JFrog UI → AI/ML section
- Create New Model → Name: "Credit Risk with Feature Store"
- Copy the Model ID generated (you'll need this for CLI commands)
This associates your code with a specific model in the JFrog platform for tracking and management.
The build process executes your build() method (which contains training logic) and packages everything for deployment:
```shell
# Build the model (run from the feature_set_quickstart_guide/ directory - the . picks up code from the current dir)
frogml models build --model-id credit_risk_model . --instance medium

# This will return a Build ID (UUID) - copy it for deployment
# Example output: Build ID: f47ac10b-58cc-4372-a567-0e02b2c3d479

# View build logs (includes training logs)
frogml models builds logs -b <your_build_id> -f

# See all build command parameters
frogml models build --help
```

What happens during build:

- Feature Store Connection: Connects to the registered Feature Store components
- Training: Your `build()` method runs with offline features from the Feature Store
- Packaging: Creates a deployment-ready container with the trained model and Feature Store connections
- Validation: Ensures the model and serving logic are ready with Feature Store integration
- Build ID Generated: Copy this ID for the deployment commands
```shell
# Deploy as a real-time endpoint (use the Build ID from the previous step)
frogml models deploy realtime --model-id credit_risk_model --build-id <your-build-id>

# See all realtime deployment parameters
frogml models deploy realtime --help
```

Test the endpoint:

```shell
python test_live_model.py
```

What JFrogML provides:

- Code to Production: A single platform for building, training, and serving ML models with the Feature Store
- FrogMLModel Framework: A standardized approach with `build()` for training and `predict()` for serving
- Feature Store Integration: Seamless online/offline feature access during training and inference
- Scalable Infrastructure: Auto-scaling real-time and batch inference endpoints with feature serving
- JFrog Integration: Seamless integration with JFrog Artifactory for model artifacts
- Security & Governance: Enterprise security controls and model governance
- Feature Store Management: Centralized feature engineering, versioning, and serving
- Monitoring & Observability: Built-in model performance monitoring and feature drift detection
- CLI & UI: Flexible interaction via command line or web interface
- Model Versioning: Automatic versioning and artifact management
- Feature Store Tools: Easy feature registration, backfill, and monitoring
- Testing Tools: Local testing capabilities with Feature Store integration before deployment
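For reference, a script like the `test_live_model.py` used above typically just POSTs a JSON payload to the realtime endpoint. A standard-library sketch follows; the endpoint URL, token handling, and payload fields are all assumptions for illustration (the real script ships with the quickstart):

```python
import json
import urllib.request

# Illustrative values only; substitute your real endpoint URL and token.
ENDPOINT = "https://models.example.jfrog.io/v1/credit_risk_model/predict"
TOKEN = "<your-access-token>"

# The Feature Store enriches the request server-side using the schema()
# features, so the client only needs to send the entity key (here, user_id).
payload = json.dumps({"columns": ["user_id"], "data": [["user_123"]]}).encode()
request = urllib.request.Request(
    ENDPOINT,
    data=payload,
    headers={"Content-Type": "application/json",
             "Authorization": f"Bearer {TOKEN}"},
)
# response = urllib.request.urlopen(request)  # uncomment against a live endpoint
```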