
Conversation

@arilotter
Collaborator

@arilotter arilotter commented Jan 7, 2026

required a bunch of upstream changes in nixpkgs :)

closes #470

@arilotter arilotter force-pushed the vlllm branch 5 times, most recently from 6a0cfa3 to c1d3801 on January 7, 2026 16:42
fixes some cuda lib path problems in vllm
updates mdbook
fixes some clippy lints

later, we should consider splitting the vllm code into a separate set of
deps, so the regular psyche binaries don't get vllm included
@arilotter
Collaborator Author

all inference tests seem to pass under vllm!

$ nix develop .#dev-python --command "cargo test -p psyche-inference --features vllm-tests"

passed :)

@samherring99 anything else to do here? are these tests run under CI somehow?

@arilotter arilotter marked this pull request as ready for review January 8, 2026 20:22
@arilotter arilotter requested a review from samherring99 January 8, 2026 20:22
@arilotter arilotter enabled auto-merge January 8, 2026 20:23
@samherring99
Collaborator

LGTM! The tests are run in CI without the `--features vllm-tests` flag, so we could remove that flag to ensure vLLM is exercised in CI every time, but otherwise this is great! Happy to make that change in another PR to unblock these changes.
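
The point above is that without `--features vllm-tests`, the feature-gated tests are compiled out entirely, so plain `cargo test` in CI never touches vLLM. A minimal sketch of how such a gate typically works in Rust; the module and test names are illustrative, not the actual psyche-inference code:

```rust
// Sketch of a Cargo-feature-gated test module, as implied by the
// `--features vllm-tests` flag. Assumes `vllm-tests = []` is declared
// under [features] in Cargo.toml.

#[cfg(feature = "vllm-tests")]
mod vllm_tests {
    #[test]
    fn vllm_completes_a_prompt() {
        // would exercise the vLLM backend here
    }
}

fn main() {
    // `cfg!` is evaluated at compile time: it is false unless the crate
    // is built with `--features vllm-tests`, so a plain `cargo test` in
    // CI skips the module above entirely.
    println!("vllm-tests enabled: {}", cfg!(feature = "vllm-tests"));
}
```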

@samherring99 left a comment

🚀

@arilotter arilotter added this pull request to the merge queue Jan 8, 2026
Merged via the queue into main with commit b8e1464 Jan 8, 2026
35 checks passed
arilotter added a commit that referenced this pull request Jan 11, 2026
arilotter added a commit that referenced this pull request Jan 11, 2026
Reverts #464

Fails with `Cannot load symbol cudnnGetVersion` on a CUDA GPU after loading the model :(

```
nix develop
just setup-localnet-light-test-run
just start-training-localnet-light-client
```

on an h100 machine.

Linked issue: Package VLLM dependency with Nix (#470)

3 participants