
Commit e4ccf2f

srdas, pre-commit-ci[bot], and dlqqq authored
Updated documentation for using Ollama with cell magics on non-default port (#1370)
* Using Ollama with magics on non-default port
* Update index.md
* ollama non-default port settings
* [pre-commit.ci] auto fixes from pre-commit.com hooks; for more information, see https://pre-commit.ci
* Update docs/source/users/index.md

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: David L. Qiu <[email protected]>
1 parent 6b3d406 commit e4ccf2f

File tree

2 files changed: +24 −0 lines changed

docs/source/_static/ollama-settings.png (binary image added, 53.5 KB)

docs/source/users/index.md

Lines changed: 24 additions & 0 deletions
@@ -459,6 +459,30 @@ If you don't see Ollama listed as a model provider in the Jupyter-AI configurati
You can install it with `pip install langchain-ollama` (as of Feb 2025 it is not available on conda-forge).

:::

By default, Ollama is served on `127.0.0.1:11434` (locally, on port `11434`), and Jupyter AI expects this address unless told otherwise. If you wish to use a remote Ollama server with a different IP address, or a local Ollama server on a different port number, you must configure this first.
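
If you manage the server yourself, the same `OLLAMA_HOST` environment variable also controls where `ollama serve` listens; for example, `OLLAMA_HOST=127.0.0.1:11000 ollama serve` starts a local server on port `11000`.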

To configure this in the chat, set the "Base API URL" field in the AI settings page to your Ollama server's custom IP address and port number:

<img src="../_static/ollama-settings.png"
    width="100%"
    alt='Screenshot of the settings panel with Ollama on non-default port.'
    class="screenshot" />

To configure this in the magic commands, set the `OLLAMA_HOST` environment variable to your Ollama server's custom IP address and port number (assuming here that you chose port `11000`) in a new code cell:

```
%load_ext jupyter_ai_magics
import os

os.environ["OLLAMA_HOST"] = "http://localhost:11000"
```
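
If the magics cannot reach the server, a quick sanity check is to query Ollama's `/api/tags` endpoint, which lists the models available on the server. Below is a minimal sketch, assuming the `requests` package is installed and the server runs on port `11000`:

```
import os

import requests

# Use the same OLLAMA_HOST setting as the magics, falling back to
# the custom port chosen above.
host = os.environ.get("OLLAMA_HOST", "http://localhost:11000")

# GET /api/tags returns the models available on the Ollama server.
resp = requests.get(f"{host}/api/tags", timeout=5)
resp.raise_for_status()
print([model["name"] for model in resp.json()["models"]])
```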

After running that cell, the AI magic command can then be used like so:

```
%%ai ollama:llama3.2
What is a transformer?
```
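
Note that the model name after the `ollama:` prefix must already be available on the server (for example, after fetching it with `ollama pull llama3.2`); any other model served by your Ollama instance can be substituted.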

### vLLM usage

`vLLM` is a fast and easy-to-use library for LLM inference and serving. The [vLLM website](https://docs.vllm.ai/en/latest/) explains installation and usage. To use `vLLM` in Jupyter AI, please see the dedicated documentation page on using [vLLM in Jupyter AI](vllm.md).
