Update related llama.cpp to support Intel AMX instruction #1827

@nai-kon

Description

Thank you for this great project!
I see that the base llama.cpp is vendored at vendor/llama.cpp. It seems the latest llama-cpp-python release (v0.3.1) is pinned to a llama.cpp snapshot that is about two months old.

Last month, llama.cpp added support for Intel AMX instructions, which is expected to provide significant performance improvements on some Intel CPUs.
ggml-org/llama.cpp#8998
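For anyone checking whether their machine would benefit: on Linux, AMX availability is advertised through the CPU feature flags in /proc/cpuinfo (as `amx_tile`, `amx_bf16`, and `amx_int8`). A minimal sketch of such a check (the `has_amx` helper is illustrative, not part of llama.cpp or llama-cpp-python):

```python
# Sketch: detect Intel AMX support on Linux by parsing CPU feature flags.
# AMX capability appears as "amx_tile" (plus "amx_bf16"/"amx_int8")
# on the "flags" line of /proc/cpuinfo.
def has_amx(cpuinfo_text: str) -> bool:
    """Return True if the 'flags' line lists the AMX tile extension."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            flags = line.split(":", 1)[1].split()
            return "amx_tile" in flags
    return False

if __name__ == "__main__":
    with open("/proc/cpuinfo") as f:
        print("AMX supported:", has_amx(f.read()))
```

The parsing is kept separate from the file read so the check can be exercised against any cpuinfo text.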

Therefore, I hope you can release a llama-cpp-python version that links to the latest llama.cpp.
