Describe the bug
If I use a custom MLflow model with a custom predict function and send JSON or CSV input, the request fails: the SageMaker Inference Toolkit decodes the request bytes to a string (source: ref), whereas the input_fn that ModelBuilder generates for MLflow models passes the input data directly into io.BytesIO (source: ref), which only accepts bytes-like objects.
For scenarios where we have a custom JSON input for a custom MLflow model, this doesn't work, as JSON input is effectively unsupported.
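A minimal sketch of the type mismatch, assuming the toolkit has already decoded the request body to str (the payload here is illustrative):

```python
import io

# The Inference Toolkit decodes JSON/CSV request bodies to str before
# calling input_fn, which then passes the value straight into io.BytesIO.
payload = '{"inputs": [[1.0, 2.0]]}'  # already a str at this point

io.BytesIO(payload)           # TypeError: a bytes-like object is required, not 'str'
io.BytesIO(payload.encode())  # what input_fn would need instead
```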
To reproduce
- Create an MLflow model with a custom Python model definition and a custom predict function
- Deploy it using ModelBuilder
- Invoke the model with a JSON payload
- The request fails because io.BytesIO receives a string instead of bytes (see the sketch below)
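A reproduction sketch; CustomModel, the model path, role ARN, and instance settings are illustrative assumptions rather than my exact setup:

```python
import mlflow.pyfunc
from sagemaker.serve import ModelBuilder
from sagemaker.serve.mode.function_pointers import Mode


class CustomModel(mlflow.pyfunc.PythonModel):
    """MLflow pyfunc model with a custom predict."""

    def predict(self, context, model_input):
        return {"echo": str(model_input)}


# Save the custom MLflow model locally.
mlflow.pyfunc.save_model(path="custom_model", python_model=CustomModel())

# Build and deploy with ModelBuilder, pointing it at the MLflow model.
model_builder = ModelBuilder(
    mode=Mode.SAGEMAKER_ENDPOINT,
    model_metadata={"MLFLOW_MODEL_PATH": "custom_model"},
    role_arn="<execution-role-arn>",
)
model = model_builder.build()
predictor = model.deploy(initial_instance_count=1, instance_type="ml.m5.xlarge")

# Invoking with a JSON payload fails: the generated input_fn receives a str
# from the Inference Toolkit and passes it straight into io.BytesIO.
predictor.predict('{"inputs": [[1.0, 2.0]]}')
```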
Expected behavior
The request succeeds and returns a prediction.
System information
- SageMaker Python SDK version: 2.237.0
- Framework name (e.g. PyTorch) or algorithm (e.g. KMeans): PyTorch / MLflow
- Framework version: NA
- Python version: Python 3.10.14
- CPU or GPU: CPU
- Custom Docker image (Y/N): NA