
Commit 6c2640f

rename get_chat_template to chat_template
1 parent 21eee35

File tree

1 file changed: +4 −4 lines changed

llama-cpp-2/src/model.rs

Lines changed: 4 additions & 4 deletions
@@ -506,20 +506,20 @@ impl LlamaModel {
         }
     }
 
-    /// Get chat template from model by name. If the name is None, the default chat template will be returned.
+    /// Get chat template from model by name. If the name parameter is None, the default chat template will be returned.
     ///
     /// You supply this into [Self::apply_chat_template] to get back a string with the appropriate template
     /// substitution applied to convert a list of messages into a prompt the LLM can use to complete
     /// the chat.
     ///
-    /// You could also use an external jinja parser, like minijinja, to parse jinja templates not
-    /// supported by the llama.cpp template engine..
+    /// You could also use an external jinja parser, like [minijinja](https://github.com/mitsuhiko/minijinja),
+    /// to parse jinja templates not supported by the llama.cpp template engine.
     ///
     /// # Errors
    ///
     /// * If the model has no chat template by that name
     /// * If the chat template is not a valid [`CString`].
-    pub fn get_chat_template(
+    fn chat_template(
         &self,
         name: Option<&str>,
     ) -> Result<LlamaChatTemplate, ChatTemplateError> {
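For context, here is a minimal usage sketch of the renamed method together with the apply_chat_template flow the doc comment describes. It is not part of this commit; the exact signatures of LlamaChatMessage::new and LlamaModel::apply_chat_template, and whether chat_template is callable from user code, are assumptions and may differ from the crate version in this repository.

// Hypothetical usage sketch (assumed signatures, not taken from this commit).
use llama_cpp_2::model::{LlamaChatMessage, LlamaModel};

fn build_prompt(model: &LlamaModel) -> Result<String, Box<dyn std::error::Error>> {
    // After this commit the getter is `chat_template` (formerly `get_chat_template`);
    // passing `None` asks for the model's default chat template.
    let template = model.chat_template(None)?;

    let messages = vec![
        LlamaChatMessage::new("system".to_string(), "You are a helpful assistant.".to_string())?,
        LlamaChatMessage::new("user".to_string(), "Hello!".to_string())?,
    ];

    // Substitute the messages into the template to get the prompt string the LLM
    // completes; the final `true` is assumed to request an open assistant turn.
    let prompt = model.apply_chat_template(&template, &messages, true)?;
    Ok(prompt)
}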
