How do you set the Ollama context window? #5

@hamin

Description


According to the Ollama API, you can pass it as part of `options`: https://github.com/ollama/ollama/blob/main/docs/faq.md
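For reference, the FAQ shows setting it on a raw API call by putting `num_ctx` inside `options` (model name here is just a placeholder):

```shell
# Raw Ollama API call with a larger context window.
# "llama3" is a placeholder — substitute whatever model is installed locally.
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Why is the sky blue?",
  "options": { "num_ctx": 8192 }
}'
```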

I see options available in swift-ollama package that you've created here for both /generate and /chat https://github.com/nathanborror/swift-ollama/blob/2b9d6efa3f641ef3d848c495764de712f469ed15/Sources/Ollama/Types/Generate.swift#L9

https://github.com/nathanborror/swift-ollama/blob/2b9d6efa3f641ef3d848c495764de712f469ed15/Sources/Ollama/Types/Chats.swift#L9

but when using swift-gen-kit, I'm not sure if there's a way to expose or set the `num_ctx` option.

This is how I'm using it:

    let serviceRequest = ChatServiceRequest(
        model: model,
        messages: [
            Message(role: .system, content: system_prompt),
            Message(role: .user, content: user_prompt),
        ])
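Something along these lines is what I'd hope for — to be clear, this is purely hypothetical; `ChatServiceRequest` doesn't currently take an `options` parameter as far as I can tell, this is just a sketch of the shape:

```swift
// Hypothetical API sketch — the `options` parameter does not exist today.
// The idea is that provider-specific settings get forwarded to Ollama's
// `options` field on /chat, the same way swift-ollama already supports.
let serviceRequest = ChatServiceRequest(
    model: model,
    messages: [
        Message(role: .system, content: system_prompt),
        Message(role: .user, content: user_prompt),
    ],
    options: ["num_ctx": 8192]  // would map to Ollama's num_ctx
)
```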

Also thank you for working on the library. I've tried a few but I think I like the way this one works :)
