
Gradio interface for 13B model #2

@Gio-CCHAN

Description


Hi,
Thanks for the work on the chat interface. chat.py and the run command work well for the 7B model (llama-7b-chat).
But when I switch to the 13B model, the fact that it runs across multiple GPUs seems to interfere with Gradio: chat.py no longer runs correctly after I change --nproc_per_node to 2.
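For reference, here is a minimal sketch of the rank-gating pattern I assume is needed: only the main process launches the Gradio UI, while the other ranks sit in a worker loop so they can still participate in generation. I'm assuming the RANK/LOCAL_RANK environment variables that torchrun sets; `launch_ui` and `worker_loop` are hypothetical placeholders, not functions from this repo.

```python
import os


def is_main_process() -> bool:
    # torchrun sets RANK (global) and LOCAL_RANK (per node) for each worker;
    # default to "0" so the sketch also works in a single-process run.
    return int(os.environ.get("RANK", os.environ.get("LOCAL_RANK", "0"))) == 0


def serve_or_wait(launch_ui, worker_loop):
    """Launch the Gradio interface on rank 0 only; every other rank
    runs a loop that waits for prompts and joins in generation."""
    if is_main_process():
        return launch_ui()  # e.g. demo.launch() in chat.py
    return worker_loop()    # e.g. block on broadcast prompts, then generate
```

I imagine the generate call inside the UI handler would also need to broadcast the prompt from rank 0 to the other ranks (e.g. with torch.distributed.broadcast_object_list) so the model-parallel shards stay in sync, but I'm not sure how that fits into chat.py.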

Could you kindly share some ideas on how to change chat.py so it runs the 13B model (llama-13b-chat)?
Thanks!
