```dockerfile
RUN curl -fsSL https://golang.org/dl/go${GOVERSION}.linux-$(case $(uname -m) in x86_64) echo amd64 ;; aarch64) echo arm64 ;; esac).tar.gz | tar xz -C /usr/local
ENV PATH=/usr/local/go/bin:$PATH
RUN wget https://storage.openvinotoolkit.org/repositories/openvino_genai/packages/nightly/2025.2.0.0.dev20250320/openvino_genai_ubuntu20_2025.2.0.0.dev20250320_x86_64.tar.gz
RUN tar -xzf openvino_genai_ubuntu20_2025.2.0.0.dev20250320_x86_64.tar.gz
```
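Assuming the Dockerfile declares `ARG GOVERSION` (the version value and image tag below are assumptions for illustration), the image can then be built with something like:

```shell
# build the image, passing the Go version the Dockerfile expects
docker build --build-arg GOVERSION=1.24.1 -t ollama-ov .
```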
We provide two ways to download the Ollama executable, either from Google Drive or from Baidu Drive:
Native Ollama only supports models in the GGUF format, while Ollama-OV invokes OpenVINO GenAI, which requires models in the OpenVINO IR format. Therefore, we have enabled support for OpenVINO model files in Ollama. For public LLMs, you can access and download OpenVINO IR models from HuggingFace or ModelScope:
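For example, a preconverted IR model can be fetched with `huggingface-cli` (a sketch; the repo id below is one of the preconverted models published under the OpenVINO organization and is used here only for illustration):

```shell
# download a preconverted OpenVINO IR model into a local directory
huggingface-cli download OpenVINO/Phi-3-mini-4k-instruct-int4-ov --local-dir Phi-3-mini-4k-instruct-int4-ov
```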
The links above are examples covering only some models; for other LLMs, check the [OpenVINO GenAI model support list](https://github.com/openvinotoolkit/openvino.genai/blob/master/SUPPORTED_MODELS.md). If you have a customized LLM, please follow the [model conversion steps of GenAI](https://github.com/openvinotoolkit/openvino.genai?tab=readme-ov-file#converting-and-compressing-text-generation-model-from-hugging-face-library).
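If you are converting a model yourself, the GenAI guide linked above is based on `optimum-cli`; a minimal sketch (the model id and output directory below are placeholders):

```shell
# install the exporter, then convert and compress a Hugging Face model to OpenVINO IR
pip install "optimum[openvino]"
optimum-cli export openvino --model Qwen/Qwen2.5-1.5B-Instruct --weight-format int4 Qwen2.5-1.5B-Instruct-int4-ov
```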
## Performance reference

| Model | Ollama-OV with GPU (Driver version: 32.0.101.6042) | Ollama-OV with NPU (Driver version: 32.0.100.3104) | Device |
| --- | --- | --- | --- |
`ollama serve` is used when you want to start ollama without running the desktop application.
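For example, you can start the server in one terminal and talk to a model from a second one (the model name is a placeholder for whichever model you have registered):

```shell
# terminal 1: start the Ollama server
ollama serve

# terminal 2: chat with a registered model
ollama run <model-name>
```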
## Building from source
Install prerequisites:
Then build and run Ollama from the root directory of the repository:
#### Windows

3. Initialize the GenAI environment
Download the GenAI runtime from [GenAI](https://storage.openvinotoolkit.org/repositories/openvino_genai/packages/nightly/2025.2.0.0.dev20250320/openvino_genai_windows_2025.2.0.0.dev20250320_x86_64.zip), then extract it to the directory `openvino_genai_windows_2025.2.0.0.dev20250320_x86_64`.
```shell
cd openvino_genai_windows_2025.2.0.0.dev20250320_x86_64
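rem setupvars.bat sets the OpenVINO/GenAI environment variables (e.g. PATH) for this shell session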
setupvars.bat
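rem GODEBUG=cgocheck=0 relaxes Go's runtime checks on pointers passed to C via cgo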
set GODEBUG=cgocheck=0
```
4. Set the cgo environment variables
```shell
go build -o ollama.exe
```
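With the GenAI environment from `step 3` still active in the same shell, you can then start the server (a minimal sketch):

```shell
ollama.exe serve
```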
6. If you don't want to recompile ollama, you can use the prebuilt executable directly: download [ollama](https://drive.google.com/file/d/1iizO9iLhSJGFUu6BgY3EwOchrCyzImUN/view?usp=drive_link), initialize the GenAI environment as in `step 3`, and run it directly.
If you encounter errors when executing ollama.exe, it is recommended that you recompile from source.
#### Linux

3. Initialize the GenAI environment
Download the GenAI runtime from [GenAI](https://storage.openvinotoolkit.org/repositories/openvino_genai/packages/nightly/2025.2.0.0.dev20250320/openvino_genai_ubuntu22_2025.2.0.0.dev20250320_x86_64.tar.gz), then extract it to the directory `openvino_genai_ubuntu22_2025.2.0.0.dev20250320_x86_64`.
```shell
cd openvino_genai_ubuntu22_2025.2.0.0.dev20250320_x86_64
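# setupvars.sh exports the OpenVINO/GenAI environment variables (e.g. LD_LIBRARY_PATH) for this shell session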
source setupvars.sh
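# GODEBUG=cgocheck=0 relaxes Go's runtime checks on pointers passed to C via cgo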
export GODEBUG=cgocheck=0
```
4. Set the cgo environment variables
```shell
go build -o ollama
```
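To sanity-check that the freshly built binary resolves the GenAI shared libraries from the environment set up in `step 3` (a quick check, assuming the runtime exposes `libopenvino*` shared objects):

```shell
# list dynamic dependencies and confirm the OpenVINO libraries are found
ldd ./ollama | grep -i openvino
```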
6. If you don't want to recompile ollama, you can use the prebuilt executable directly: download [ollama](https://drive.google.com/file/d/1HEyZNNCbWSidKNQl4MRsD8FuwEZtdyew/view?usp=drive_link), initialize the GenAI environment as in `step 3`, and run it directly.
If you encounter problems during use, it is recommended to rebuild from source.