Loading a model failed with:
"module 'llama_cpp.llama_chat_format' has no attribute 'get_chat_completion_handler'"
I checked
https://github.com/inference-sh/llama-cpp-python/blob/main/llama_cpp%2Fllama_chat_format.py
and get_chat_completion_handler is missing.
In the upstream repository, however,
https://github.com/abetlen/llama-cpp-python/blob/main/llama_cpp%2Fllama_chat_format.py
the same function is present, so the fork appears to be out of sync with upstream.
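Until the fork catches up with upstream, a runtime check can confirm which build is installed before attempting to load the model. Below is a minimal sketch (the helper name `has_chat_handler_api` is my own, not part of llama-cpp-python) that probes the installed `llama_cpp.llama_chat_format` module for the attribute:

```python
import importlib

def has_chat_handler_api(module_name="llama_cpp.llama_chat_format",
                         attr="get_chat_completion_handler"):
    """Return True if the installed build exposes `attr`,
    False if the module imports but lacks it (e.g. the fork),
    and None if the package is not installed at all."""
    try:
        mod = importlib.import_module(module_name)
    except ImportError:
        return None  # llama-cpp-python not installed in this environment
    return hasattr(mod, attr)

if __name__ == "__main__":
    status = has_chat_handler_api()
    if status is True:
        print("get_chat_completion_handler is available")
    elif status is False:
        print("installed build lacks get_chat_completion_handler; "
              "likely the out-of-sync fork")
    else:
        print("llama-cpp-python is not installed")
```

Running this before model load gives a clearer failure message than the bare AttributeError, and makes it easy to verify whether switching back to the upstream abetlen/llama-cpp-python package resolves the problem.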