Conversation

@guangy10 guangy10 commented Oct 29, 2024

What does this PR do?

DistilBERT is ExecuTorch compatible.

Unit Test:
RUN_SLOW=1 pytest tests/models/distilbert/test_modeling_distilbert.py -k test_export -v

tests/models/distilbert/test_modeling_distilbert.py::DistilBertModelIntergrationTest::test_export PASSED [100%]

E2E test in ExecuTorch:
With patch pytorch/executorch#6509 applied:
python -m extension.export_util.export_hf_model -hfm="distilbert-base-uncased" -lm masked_lm

Saved exported program to ./distilbert.pte

./cmake-out/backends/xnnpack/xnn_executor_runner --model_path distilbert.pte

I 00:00:00.080326 executorch:executor_runner.cpp:82] Model file distilbert.pte is loaded.
I 00:00:00.080359 executorch:executor_runner.cpp:91] Using method forward
I 00:00:00.080361 executorch:executor_runner.cpp:138] Setting up planned buffer 0, size 12286720.
I 00:00:00.115094 executorch:executor_runner.cpp:161] Method loaded.
I 00:00:00.115124 executorch:executor_runner.cpp:171] Inputs prepared.
I 00:00:00.179285 executorch:executor_runner.cpp:180] Model executed successfully.
I 00:00:00.179301 executorch:executor_runner.cpp:184] 1 outputs:
Output 0: tensor(sizes=[1, 64, 30522], [
  -4.47825, -4.55548, -4.59359, -4.61276, -4.71701, -4.22803, -4.54525, -4.30736, -4.532, -4.9645,
  -4.19537, -4.51069, -4.34262, -4.96867, -4.38696, -5.06627, -5.01279, -4.89841, -4.42651, -4.47658,
  -4.70912, -4.49927, -4.48796, -4.67513, -4.3218, -4.54809, -4.59159, -4.65592, -4.54133, -4.50207,
  -4.24141, -4.65805, -4.49932, -4.36075, -4.38477, -4.69771, -4.76032, -5.06464, -4.57687, -4.54149,
  -4.54834, -4.80815, -4.47513, -4.61154, -4.69458, -4.09497, -4.42706, -4.48752, -4.84431, -4.40653,
  -4.6515, -4.60421, -4.39167, -4.9955, -4.65156, -4.57042, -4.58516, -4.46815, -4.43985, -4.83551,
  -4.20381, -4.59275, -4.94262, -4.32183, -4.44933, -4.59167, -4.66095, -4.85241, -4.83965, -4.37491,
  -4.82371, -4.34802, -4.26705, -4.79766, -4.47379, -4.7745, -4.59805, -4.6717, -4.2979, -4.65086,
  -4.88208, -4.84994, -4.24183, -4.73356, -4.97729, -5.18642, -4.64655, -4.64227, -4.46517, -4.6624,
  -4.50896, -4.75761, -4.26062, -4.75898, -4.7547, -4.54612, -4.43117, -4.4847, -4.28017, -4.33875,
  ...,
  -2.56383, -0.124811, -1.62058, -0.539149, -2.0116, -2.13068, 0.614868, -1.62362, -2.73875, -0.295115,
  -2.33206, 0.223186, -3.19978, -2.81419, -0.764227, 0.385865, -3.02447, -4.4802, -3.33432, -1.58703,
  -1.79603, -2.96534, -1.06687, -3.17183, -1.81405, 0.0236263, -0.992222, -3.71788, 0.761198, 0.089091,
  -2.99735, -2.04351, -2.40324, -2.86246, -1.24337, -2.34749, -2.01503, -2.45599, -4.6185, 1.14074,
  -3.04769, -1.78048, -1.09878, -3.30111, -2.08858, -1.64816, -2.03306, -1.94704, -0.205174, -1.90752,
  -2.6837, -1.25019, -0.415001, -3.73985, -1.53322, -0.605044, -3.7232, -0.258519, -1.85742, -1.55172,
  -4.25782, -3.31136, -1.23, -1.60789, -2.16738, -2.58743, 0.324617, 0.266767, -2.14392, -2.59203,
  -1.90562, -3.10258, -1.81314, 1.15056, -3.81185, -2.48559, -2.03798, -2.57377, -2.39025, -1.43463,
  -0.672718, -1.97253, -3.45209, -1.31699, -0.362099, -2.69917, -3.11479, -3.16947, -0.0704084, 0.330248,
  -3.50465, -3.19989, -4.00352, -3.97841, -2.49317, -4.99941, -4.31784, -3.77685, -4.15103, 3.47488,
])

Before submitting

Who can review?

@ArthurZucker
@qubvel

@guangy10 guangy10 mentioned this pull request Oct 29, 2024
33 tasks
@guangy10 guangy10 changed the title from "DistillBERT is ExecuTorch compatible" to "DistilBERT is ExecuTorch compatible" Oct 29, 2024
@qubvel qubvel left a comment

Thanks, just a few nits!
Can you please also push an empty commit with the message [run_slow] distilbert to trigger the slow test run, thanks!

@guangy10 guangy10 force-pushed the distilbert_executorch branch from ea7c7b0 to b9afa5c Compare October 29, 2024 23:56
@guangy10 guangy10 force-pushed the distilbert_executorch branch from f8e80c2 to 5f3fbc3 Compare October 30, 2024 00:05
@guangy10 guangy10 requested a review from qubvel October 30, 2024 00:05
@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@qubvel qubvel left a comment

Thanks!

@qubvel qubvel requested a review from ArthurZucker October 30, 2024 08:27
@ArthurZucker ArthurZucker left a comment

LGTM!

@ArthurZucker ArthurZucker merged commit 663c851 into huggingface:main Nov 5, 2024
18 checks passed
BernardZach pushed a commit to BernardZach/transformers that referenced this pull request Dec 5, 2024
* DistillBERT is ExecuTorch compatible

* [run_slow] distilbert

* [run_slow] distilbert

---------

Co-authored-by: Guang Yang <[email protected]>