- Quick Start Guide
- License
- Installation
- Community
- Installation Parameters
- Deployments
- Servers
- Routing
- Wrappers and SDKs
- Python Language Wrapper
- Go Language Wrapper
- Java Language Wrapper
- Node.js Language Wrapper
- C++ Language Wrapper
- R Language Wrapper
- Integrations
- Notebooks
- Install Seldon Core
- Install MinIO
- Deploy a Scikit-learn Model Binary
- Deploy a Tensorflow Exported Model
- MLflow Pre-packaged Model Server A/B Test
- MLflow Open Inference Protocol End to End Workflow
- Deploy a XGBoost Model Binary
- Deploy Pre-packaged Model Server with Cluster's MinIO
- Custom Pre-packaged LightGBM Server
- SKLearn Spacy NLP
- SKLearn Iris Classifier
- Sagemaker SKLearn Example
- TFserving MNIST
- Statsmodels Holt-Winters Time-Series Model
- Runtime Metrics & Tags
- Triton GPT2 Example
- NVIDIA TensorRT MNIST
- OpenVINO ImageNet
- OpenVINO ImageNet Ensemble
- Triton Examples
- Kubeflow Seldon E2E Pipeline
- H2O Java MoJo
- Outlier Detection with Combiner
- Stream Processing with KNative Eventing
- Kafka CIFAR10
- Kafka SpaCy SKLearn NLP
- Kafka KEDA Autoscaling
- CPP Wrapper Simple Single File
- Advanced CPP Buildsystem Override
- Environment Variables
- AWS EKS Tensorflow Deep MNIST
- Azure AKS Tensorflow Deep MNIST
- GKE with GPU Tensorflow Deep MNIST
- Alibaba Cloud Tensorflow Deep MNIST
- Triton GPT2 Example Azure
- Setup for Triton GPT2 Example Azure
- Real Time Monitoring of Statistical Metrics
- Model Explainer Example
- Model Explainer Open Inference Protocol Example
- Outlier Detection on CIFAR10
- Training Outlier Detector for CIFAR10 with Poetry
- Batch Processing with Argo Workflows and S3 / Minio
- Batch Processing with Argo Workflows and HDFS
- Batch Processing with Kubeflow Pipelines
- Autoscaling Example
- KEDA Autoscaling example
- Request Payload Logging with ELK
- Custom Metrics with Grafana & Prometheus
- Distributed Tracing with Jaeger
- CI / CD with Jenkins Classic
- CI / CD with Jenkins X
- Replica Control
- Example Helm Deployments
- Max gRPC Message Size
- Deploy Multiple Seldon Core Operators
- Protocol Examples
- Configurable Timeouts
- Custom Protobuf Data Example
- Disruption Budgets Example
- Istio AB Test
- Ambassador AB Test
- Seldon/Iter8 - Progressive AB Test with Single Seldon Deployment
- Seldon/Iter8 - Progressive AB Test with Multiple Seldon Deployments
- Chainer MNIST
- Custom Pre-processors with the Open Inference Protocol
- Graph Examples
- Ambassador Canary
- Ambassador Shadow
- Ambassador Headers
- Istio Examples
- Istio Canary
- Patch Volumes for Version 1.2.0 Upgrade
- Service Orchestrator Overhead
- Tensorflow Benchmark
- Argo Workflows Benchmarking
- Python Serialization Cost Benchmark
- KMP_AFFINITY Benchmarking Example
- Kafka Payload Logging
- RClone Storage Initializer - testing new secret format
- RClone Storage Initializer - upgrading your cluster (AWS S3 / MinIO)
- Annotation Based Configuration
- Benchmarking
- General Availability
- Helm Charts
- Images
- Logging and Log Level
- Private Docker Registry
- Prediction APIs
- Open Inference Protocol
- Scalar Value Types
- Microservice API
- External API
- Prediction Proto Buffer Spec
- Prediction Open API Spec
- Python API Reference
- Release Highlights
- Release 1.7.0 Highlights
- Release 1.6.0 Highlights
- Release 1.5.0 Highlights
- Release 1.1.0 Highlights
- Release 1.0.0 Highlights
- Release 0.4.1 Highlights
- Release 0.4.0 Highlights
- Release 0.3.0 Highlights
- Release 0.2.7 Highlights
- Release 0.2.6 Highlights
- Release 0.2.5 Highlights
- Release 0.2.3 Highlights
- Seldon Deployment CRD
- Service Orchestrator
- Kubeflow
- Archived Docs