1 file changed: +4 −4 lines changed
@@ -506,20 +506,20 @@ impl LlamaModel {
         }
     }

-    /// Get chat template from model by name. If the name is None, the default chat template will be returned.
+    /// Get chat template from model by name. If the name parameter is None, the default chat template will be returned.
     ///
     /// You supply this into [Self::apply_chat_template] to get back a string with the appropriate template
     /// substitution applied to convert a list of messages into a prompt the LLM can use to complete
     /// the chat.
     ///
-    /// You could also use an external jinja parser, like minijinja, to parse jinja templates not
-    /// supported by the llama.cpp template engine. .
+    /// You could also use an external jinja parser, like [minijinja](https://github.com/mitsuhiko/minijinja),
+    /// to parse jinja templates not supported by the llama.cpp template engine.
     ///
     /// # Errors
     ///
     /// * If the model has no chat template by that name
     /// * If the chat template is not a valid [`CString`].
-    pub fn get_chat_template(
+    fn chat_template(
        &self,
        name: Option<&str>,
    ) -> Result<LlamaChatTemplate, ChatTemplateError> {
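The renamed method keeps the `Option<&str>` name parameter, so callers pass `None` for the model's default template and `Some(name)` for a named one. A minimal sketch of that calling pattern follows; the stub types below (`LlamaModel`, `LlamaChatTemplate`, `ChatTemplateError`) are hypothetical stand-ins mirroring only the signature shown in the diff, not the crate's real implementation:

```rust
// Hypothetical stand-ins for the crate types named in the diff,
// included only so the calling pattern compiles on its own.
#[derive(Debug)]
struct ChatTemplateError;

#[derive(Debug)]
struct LlamaChatTemplate(String);

struct LlamaModel;

impl LlamaModel {
    /// Mirrors the signature from the diff: `None` selects the model's
    /// default chat template; an unknown name is an error.
    fn chat_template(&self, name: Option<&str>) -> Result<LlamaChatTemplate, ChatTemplateError> {
        match name {
            None => Ok(LlamaChatTemplate("default-template".to_string())),
            Some("tool_use") => Ok(LlamaChatTemplate("tool-use-template".to_string())),
            // The model has no chat template by that name.
            Some(_) => Err(ChatTemplateError),
        }
    }
}

fn main() {
    let model = LlamaModel;
    // Default template: pass None.
    let default = model.chat_template(None).expect("default template exists");
    println!("{}", default.0);
    // Looking up a name the model lacks yields the error case.
    assert!(model.chat_template(Some("missing")).is_err());
}
```

The rename from `get_chat_template` to `chat_template` follows the common Rust convention of dropping `get_` prefixes from accessor-style methods.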