This repository was archived by the owner on Sep 10, 2025. It is now read-only.

Conversation

@iseeyuan
Contributor

These two APIs should have the same default max_seq_length, and there is probably a better place to define it. The max_seq_length in chat is 2048, set in https://github.com/pytorch/torchchat/blob/main/torchchat/generate.py#L821.

The way max_seq_length is defined in the model looks flaky. It should be consistent with the model definition and the cache setup.

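A minimal sketch of the suggestion: hoist the shared default into one place so the chat and generate entry points cannot drift apart. All names here are illustrative, not torchchat's actual API.

```python
# Hypothetical sketch of the suggested fix: a single source of truth for the
# default max_seq_length, read by both entry points. Function names and
# signatures are illustrative assumptions, not torchchat's real API.

DEFAULT_MAX_SEQ_LENGTH = 2048  # one definition, consistent with the KV-cache setup


def generate(prompt: str, max_seq_length: int = DEFAULT_MAX_SEQ_LENGTH) -> str:
    # ... would size the KV cache and run generation up to max_seq_length ...
    return f"generate: max_seq_length={max_seq_length}"


def chat(prompt: str, max_seq_length: int = DEFAULT_MAX_SEQ_LENGTH) -> str:
    # ... chat loop shares the same default instead of hard-coding 2048 ...
    return f"chat: max_seq_length={max_seq_length}"
```

With this shape, changing the default in one place keeps both APIs and the cache sizing in agreement.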
@pytorch-bot

pytorch-bot bot commented Sep 26, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/torchchat/1212

Note: Links to docs will display an error until the docs builds have been completed.

❌ 8 New Failures, 6 Unrelated Failures

As of commit 49687b6 with merge base c454026 (image):

NEW FAILURES - The following jobs have failed:

BROKEN TRUNK - The following jobs failed but were present on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Meta Open Source bot. label Sep 26, 2024

@Jack-Khuu Jack-Khuu closed this Sep 26, 2024
