Commit 073c5cb

Fix ChatGLM3 and Phi3 issue with Transformers 4.38.1. (#4980) (#4983)
1 parent: 3fe8fa2

File tree

  • intel_extension_for_pytorch/transformers/models/xpu/optimize_transformers/modules

1 file changed: 3 additions, 1 deletion
intel_extension_for_pytorch/transformers/models/xpu/optimize_transformers/modules/Functions.py

Lines changed: 3 additions & 1 deletion
@@ -2078,9 +2078,11 @@ def ipex_disable_attn_mask_prepare(model):
 
     model_list = {
         transformers.models.llama.modeling_llama.LlamaForCausalLM: "LlamaModel",
-        transformers.models.phi3.modeling_phi3.Phi3ForCausalLM: "Phi3Model",
         transformers.models.qwen2.modeling_qwen2.Qwen2ForCausalLM: "Qwen2Model",
     }
+    # phi3 is not an attribute of transformers.models in Transformers 4.38.1.
+    if hasattr(transformers.models, "phi3"):
+        model_list[transformers.models.phi3.modeling_phi3.Phi3ForCausalLM] = "Phi3Model"
 
     if type(model) in model_list.keys():
         base_module = type(model).__base__.__module__
