Conversation

pepijndevos
This PR makes it possible to plug an AMD GPU into a Raspberry Pi 5 and run llama.cpp directly on the pi for completely integrated local LLMs.

I'm marking this as a draft to open a discussion about what would be an acceptable way to upstream these changes.

What makes up this PR:

and, tangentially related, some changes for my CM5 GPU board Sentinel Core that definitely won't fly as-is:

  • enable the external wifi antenna
  • add my LLM addon repository

But maybe these would be acceptable as a new board definition? Could some of the other changes also be scoped to this board?

@sairon (Member) left a comment

In my view these changes are too invasive (e.g. the AMDGPU driver and firmware will cause unnecessary bloat for most users) given how niche a use case this covers, so my call is not to accept these changes here.

On a more positive note, we have discussed in the past how we could make OS forks easier to maintain, both for developers and for users, since right now your only options for upgrading a forked build are doing it manually from the command line or using the offline upgrade with a RAUC upgrade file. As we're also moving away from Supervised installs, we should start looking into this rather soon.

@pepijndevos (Author)
Is there a way to add these dependencies only for this specific board that would make it acceptable?

Indeed, better ways to maintain these changes externally would be greatly appreciated. My current solution is a custom add-on that interfaces with RAUC, which effectively means that updating the add-on updates the OS. That's a bit confusing to users, so if I could hook into the version JSON instead, that would be much better.

https://github.com/sanctuary-systems-com/llm-addons
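For reference, the RAUC hand-off described above can be sketched roughly like this. This is a hypothetical illustration, not the actual add-on code: the bundle path, version check, and function names are assumptions; only `rauc install` itself is the real RAUC CLI command.

```shell
#!/bin/sh
# Hypothetical sketch: an add-on staging an OS update through RAUC.

# Report the currently running OS version from /etc/os-release.
current_version() {
    . /etc/os-release
    echo "$VERSION_ID"
}

# Install a RAUC bundle into the inactive slot. `rauc install` is the
# real RAUC CLI; the wrapper and its messages are illustrative.
apply_bundle() {
    bundle="$1"
    rauc install "$bundle" && echo "Update staged; reboot to activate."
}

# Only act when explicitly asked, so sourcing this file has no side effects.
if [ "${1:-}" = "--apply" ]; then
    apply_bundle "${2:?usage: $0 --apply /path/to/bundle.raucb}"
fi
```

On a dual-slot RAUC setup the bundle is written to the inactive slot, so the running system stays intact until the next reboot, which is what makes this usable as an "add-on updates the OS" mechanism in the first place.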

@pepijndevos (Author)
Speaking of forks, is there a sanctioned way to preconfigure certain add-ons and integrations?
