
[BUG] Error on submit job llama 3.2 3B ai - hub #279

@PhamDangNguyen

Description

Hello! First of all, I would like to thank the team for their contributions to the AI community.
I am trying to export the LLaMA 3.2 3B model on Qualcomm AI Hub.
However, when I submit the job llama_v3_2_3b_instruct_prompt_2_of_3, I encounter the following error:


Additional information from the compile log:

```
Error occurred due to op : _model_model_layers_0_mlp_gate_proj_MatMul;
op_type : FullyConnected;
reason : BQ is not supported
Failed to validate op _model_model_layers_0_mlp_gate_proj_MatMul with error 0xc26 : QNN_OP_PACKAGE_ERROR_VALIDATION_FAILURE
```

For additional information, please refer to the QNN HTP Backend Op Definition Supplement: https://docs.qualcomm.com/bundle/publicresource/topics/80-63442-10/HtpOpDefSupplement.html

This is the command I ran on the host machine (Windows):

```
python -m qai_hub_models.models.llama_v3_2_3b_instruct.export ^
  --chipset qualcomm-sa8295p ^
  --context-length 1024 ^
  --checkpoint DEFAULT_W4 ^
  --skip-profiling ^
  --skip-inferencing ^
  --output-dir output_bin
```

Could you please let me know what is causing this error and how to resolve it?


Labels: question (Please ask any questions on Slack. This issue will be closed once responded to.)
