
Adjusting Token Limit for Longer Responses (Chatbot) #106

@SeverusYixin

Description


If the chatbot needs to generate longer answers in the future, increase the max_new_tokens limit accordingly.

The setting can be found here:

def generate_response(self, prompt, max_new_tokens=500, num_return_sequences=1):

You can refer to PR #105 for an example of how to set it.
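As a minimal sketch of how the limit behaves, the toy class below mimics the `generate_response` signature shown above. The real method presumably forwards `max_new_tokens` to a language model's generate call; here the model is a stand-in that simulates token-by-token output, purely to illustrate that raising the limit allows longer answers.

```python
class ChatBot:
    """Illustrative stand-in; the real class presumably wraps a language model."""

    def generate_response(self, prompt, max_new_tokens=500, num_return_sequences=1):
        # In the real implementation this would call the model's generate
        # method with max_new_tokens forwarded. Here we simulate a model
        # that wants to emit 2000 tokens, so the limit visibly truncates it.
        answer_tokens = ["tok"] * 2000
        truncated = answer_tokens[:max_new_tokens]
        return [" ".join(truncated) for _ in range(num_return_sequences)]


bot = ChatBot()
short = bot.generate_response("Explain transformers")                       # default 500
longer = bot.generate_response("Explain transformers", max_new_tokens=1500)  # raised limit
```

With the default, the answer is cut at 500 tokens; passing `max_new_tokens=1500` lets the same prompt produce a three-times-longer response.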
