Commit 5a91109

model-conversion : add trust_remote_code for orig model run [no ci] (#16751)
This commit adds the trust_remote_code=True argument when loading models with AutoConfig, AutoTokenizer, and AutoModelForCausalLM in the run-original-model script.

The motivation for this is that some models require custom code to be loaded properly, and setting trust_remote_code=True avoids a prompt asking for user confirmation:

```console
(venv) $ make causal-run-original-model
The repository /path/to/model contains custom code which must be executed
to correctly load the model. You can inspect the repository content at
/path/to/model.

Do you wish to run the custom code? [y/N] N
```

Having this as the default seems like a safe choice: we have to clone or download the models we convert anyway, so we would already expect to run any custom code they contain.
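All three loader calls touched by this commit gain the same keyword argument. As a minimal sketch of that pattern (the `remote_code_kwargs` helper is hypothetical, not part of the commit), the shared default could be centralized like this:

```python
# Hypothetical helper (not in the commit): build the keyword arguments
# shared by the AutoConfig / AutoTokenizer / AutoModelForCausalLM loads,
# always enabling trust_remote_code so models that ship custom code
# load without the interactive "[y/N]" confirmation prompt.
def remote_code_kwargs(extra=None):
    kwargs = {"trust_remote_code": True}
    if extra:
        kwargs.update(extra)  # merge per-call options, e.g. device_map
    return kwargs

# Usage against the real transformers loaders would then read:
#   config = AutoConfig.from_pretrained(model_path, **remote_code_kwargs())
#   model = AutoModelForCausalLM.from_pretrained(
#       model_path,
#       **remote_code_kwargs({"device_map": "auto", "offload_folder": "offload"}),
#   )
```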
1 parent f8f071f commit 5a91109

File tree

1 file changed: +4 −4 lines changed

examples/model-conversion/scripts/causal/run-org-model.py

Lines changed: 4 additions & 4 deletions
```diff
@@ -138,7 +138,7 @@ def fn(_m, input, output):
         "Model path must be specified either via --model-path argument or MODEL_PATH environment variable"
     )
 
-config = AutoConfig.from_pretrained(model_path)
+config = AutoConfig.from_pretrained(model_path, trust_remote_code=True)
 
 print("Model type: ", config.model_type)
 print("Vocab size: ", config.vocab_size)
@@ -148,8 +148,8 @@ def fn(_m, input, output):
 print("EOS token id: ", config.eos_token_id)
 
 print("Loading model and tokenizer using AutoTokenizer:", model_path)
-tokenizer = AutoTokenizer.from_pretrained(model_path)
-config = AutoConfig.from_pretrained(model_path)
+tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
+config = AutoConfig.from_pretrained(model_path, trust_remote_code=True)
 
 if unreleased_model_name:
     model_name_lower = unreleased_model_name.lower()
@@ -171,7 +171,7 @@ def fn(_m, input, output):
     exit(1)
 else:
     model = AutoModelForCausalLM.from_pretrained(
-        model_path, device_map="auto", offload_folder="offload"
+        model_path, device_map="auto", offload_folder="offload", trust_remote_code=True
    )
 
 for name, module in model.named_modules():
```
