
Commit aead8a9

emojiiii and chenyuankun authored
Fix optional chaining for batch size calculation in PreTrainedModel (#1063)
Co-authored-by: chenyuankun <[email protected]>
1 parent 11db949 commit aead8a9

File tree

1 file changed (+1, −1 lines)

src/models.js

Lines changed: 1 addition & 1 deletion
@@ -1814,7 +1814,7 @@ export class PreTrainedModel extends Callable {
         const dtype = session?.config?.kv_cache_dtype ?? 'float32';
         const empty = (dtype === 'float16') ? new Uint16Array() : [];

-        const batch_size = (decoderFeeds[this.main_input_name] ?? decoderFeeds.attention_mask).dims?.[0] ?? 1;
+        const batch_size = (decoderFeeds[this.main_input_name] ?? decoderFeeds.attention_mask)?.dims?.[0] ?? 1;
         const shapes = getKeyValueShapes(this.config, { batch_size });

         for (const name in shapes) {
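
A minimal sketch of why the extra `?.` matters. The plain-object feeds, the `main_input_name` value, and the `dims` arrays below are made up for illustration only (real `decoderFeeds` values are runtime tensors); the point is that when neither the main input nor `attention_mask` is present, the old expression reads `.dims` on `undefined` and throws a TypeError, while the patched expression short-circuits and falls back to a batch size of 1:

    const main_input_name = 'input_ids';

    // Hypothetical feeds; a real tensor would carry a `dims` array such as [2, 7].
    const withMask = { attention_mask: { dims: [2, 7] } };
    const withoutInputs = {}; // neither main input nor attention_mask provided

    // Old expression: (undefined ?? undefined).dims throws
    // "TypeError: Cannot read properties of undefined (reading 'dims')".
    // const bad = (withoutInputs[main_input_name] ?? withoutInputs.attention_mask).dims?.[0] ?? 1;

    // Patched expression: the leading `?.` yields undefined instead of throwing,
    // and `?? 1` supplies the default batch size.
    const fromMask = (withMask[main_input_name] ?? withMask.attention_mask)?.dims?.[0] ?? 1;           // 2
    const fallback = (withoutInputs[main_input_name] ?? withoutInputs.attention_mask)?.dims?.[0] ?? 1;  // 1

    console.log(fromMask, fallback); // 2 1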
