
Add Ministral 3 instruct support #3415

Open
dhandhalyabhavik wants to merge 2 commits into openvinotoolkit:latest from dhandhalyabhavik:ministral-3-support

Conversation

@dhandhalyabhavik

This PR adds a notebook for running Ministral-3 (Mistral AI's multimodal model) with OpenVINO, supporting both 3B and 8B model variants.

What's included

Tested configurations

Tested and verified both the Ministral-3-3B-Instruct-2512-BF16 and Ministral-3-8B-Instruct-2512-BF16 models.
Verified working on CPU, dGPU, and iGPU.

Note

openvino_genai.VLMPipeline does not support the mistral3 model type yet, so this notebook uses the optimum-intel OVModelForVisualCausalLM pipeline for inference.
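Since the model is exported through optimum-intel rather than loaded with VLMPipeline, the notebook's export step boils down to an `optimum-cli` invocation. The sketch below only builds that command as a subprocess argument list; actually running it requires the optimum-intel build from PR openvinotoolkit#1659, and the output directory name here is an assumption for illustration.

```python
# Sketch only: assemble the optimum-cli export command for the notebook's
# default model. The "int4" weight format matches the notebook's default
# compression; "ministral-3-3b-int4" is a hypothetical output directory.
model_id = "mistralai/Ministral-3-3B-Instruct-2512-BF16"
output_dir = "ministral-3-3b-int4"

export_cmd = [
    "optimum-cli", "export", "openvino",
    "--model", model_id,
    "--weight-format", "int4",  # INT4 weight compression by default
    output_dir,
]
print(" ".join(export_cmd))
```

The resulting command can be run in a notebook cell (e.g. via `subprocess.run(export_cmd)`) once the patched optimum-intel branch is installed.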

- Add ministral-3.ipynb notebook for Ministral-3-3B-Instruct-2512 VLM
- Uses OVModelForVisualCausalLM from optimum-intel PR openvinotoolkit#1659
- INT4 weight compression by default
- Includes gradio_helper.py for interactive chatbot demo
- Exports from mistralai/Ministral-3-3B-Instruct-2512-BF16 model
…demo image

- Add model dropdown for 3B and 8B variants (Ministral-3-3B/8B-Instruct-2512-BF16)
- Remove broken device=GPU cell that crashed the kernel
- Replace ipywidgets device_widget with plain Python device selection
- Use qwen3-vl demo image (beach scene) for image inference
- Tested INT4 on both CPU and GPU for both model sizes
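The "plain Python device selection" replacing the ipywidgets `device_widget` could look like the sketch below. This is an assumption about the notebook's approach, not its exact code; in practice the available-device list would come from `openvino.Core().available_devices` rather than the hard-coded placeholder used here.

```python
# Sketch of an ipywidgets-free device picker. SUPPORTED_DEVICES is a
# placeholder; the notebook would query openvino.Core().available_devices.
SUPPORTED_DEVICES = ["CPU", "GPU"]

def select_device(requested: str = "CPU") -> str:
    """Return the requested device if it is available, else fall back to CPU."""
    device = requested.upper()
    return device if device in SUPPORTED_DEVICES else "CPU"

print(select_device("gpu"))  # -> GPU when a GPU plugin is present
```

Falling back to CPU rather than raising avoids the kernel-crashing `device=GPU` cell this PR removes: an unavailable device simply degrades to the always-present CPU plugin.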
@review-notebook-app

Check out this pull request on ReviewNB

See visual diffs & provide feedback on Jupyter Notebooks.

