Commit fe262da

fix(llama): add pretrained_init_configuration to LlamaConfig (#6710)

Parent: ae777fd

File tree

1 file changed (+1, -0 lines)

paddlenlp/transformers/llama/configuration.py

Lines changed: 1 addition, 0 deletions

```diff
@@ -206,6 +206,7 @@ class LlamaConfig(PretrainedConfig):
         "n_inner": "intermediate_size",
         "activation_function": "hidden_act",
     }
+    pretrained_init_configuration = LLAMA_PRETRAINED_INIT_CONFIGURATION

     def __init__(
         self,
```
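The one-line addition matters because transformers-style config base classes typically resolve shorthand model names through the `pretrained_init_configuration` class attribute; if a subclass never assigns it, name lookup silently falls back to an empty mapping. Below is a minimal, self-contained sketch of that mechanism. The simplified `PretrainedConfig` base and the contents of `LLAMA_PRETRAINED_INIT_CONFIGURATION` here are illustrative assumptions, not PaddleNLP's actual implementation or values.

```python
# Sketch of how a config base class can consume pretrained_init_configuration
# to resolve shorthand model names. Illustrative only; not PaddleNLP's code.

class PretrainedConfig:
    # Subclasses are expected to override this with a name -> kwargs mapping.
    pretrained_init_configuration = {}

    def __init__(self, **kwargs):
        for key, value in kwargs.items():
            setattr(self, key, value)

    @classmethod
    def from_pretrained(cls, name, **overrides):
        # Without the class attribute set, shorthand names cannot resolve.
        if name not in cls.pretrained_init_configuration:
            raise ValueError(f"Unknown model name: {name}")
        kwargs = dict(cls.pretrained_init_configuration[name])
        kwargs.update(overrides)
        return cls(**kwargs)


# Hypothetical entry standing in for LLAMA_PRETRAINED_INIT_CONFIGURATION.
LLAMA_PRETRAINED_INIT_CONFIGURATION = {
    "llama-7b": {"hidden_size": 4096, "num_hidden_layers": 32},
}


class LlamaConfig(PretrainedConfig):
    # The commit's one-line fix: expose the mapping on the subclass.
    pretrained_init_configuration = LLAMA_PRETRAINED_INIT_CONFIGURATION


config = LlamaConfig.from_pretrained("llama-7b")
print(config.hidden_size)  # 4096
```

With the attribute assigned, `LlamaConfig.from_pretrained("llama-7b")` can build a config from the registered defaults; without it, the same call would fail because the inherited mapping is empty.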

Comments (0)