
Example code to run batched inference? #3


Description

@learning-chip

Nice work! In the paper I saw this batched result:

[Figure from the paper: batched inference results]

But the examples, e.g. https://github.com/Infini-AI-Lab/TriForce/blob/main/test/on_chip.py, only use batch size = 1. Does the code support batched speculative inference?
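To clarify what I mean by "batched": something along these lines, where several prompts are padded together and the draft/verify loop would run over the whole batch in one forward pass. This is only a rough sketch using generic HuggingFace/PyTorch pieces (and a small stand-in tokenizer), not the TriForce API, so all names here are my own assumptions.

```python
# Hypothetical sketch of batched input preparation, NOT the TriForce API.
# It just illustrates the batched usage pattern I am asking about.
import torch
from transformers import AutoTokenizer

# Stand-in tokenizer for the sketch; in practice it would be the
# long-context Llama checkpoint the repo targets.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token
tokenizer.padding_side = "left"  # left-pad so generation positions line up

prompts = [
    "Summarize the TriForce approach in one sentence.",
    "Explain speculative decoding briefly.",
]

batch = tokenizer(prompts, return_tensors="pt", padding=True)
input_ids = batch["input_ids"]            # shape: [batch_size, seq_len]
attention_mask = batch["attention_mask"]  # 0 for padding positions

# In a batched speculative loop, the draft model would propose tokens for
# every row of input_ids at once, and the target model would verify the
# whole [batch_size, num_draft_tokens] block in a single forward pass.
print(input_ids.shape, attention_mask.shape)
```

Is something like this supported, or is the batched result in the paper produced with a separate (unreleased) code path?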
