diff --git a/.github/workflows/run-readme-pr-mps.yml b/.github/workflows/run-readme-pr-mps.yml
index 718d5cf9e..3e90265f5 100644
--- a/.github/workflows/run-readme-pr-mps.yml
+++ b/.github/workflows/run-readme-pr-mps.yml
@@ -10,6 +10,7 @@ jobs:
     uses: pytorch/test-infra/.github/workflows/macos_job.yml@main
     with:
       runner: macos-m1-14
+      timeout-minutes: 50
       script: |
         conda create -y -n test-readme-mps-macos python=3.10.11 llvm-openmp
         conda activate test-readme-mps-macos
diff --git a/README.md b/README.md
index 4b910e575..3c37edf09 100644
--- a/README.md
+++ b/README.md
@@ -231,6 +231,8 @@ python3 torchchat.py server llama3.1
 ```
 [skip default]: end
 
+[shell default]: python3 torchchat.py server llama3.1 & server_pid=$!
+
 In another terminal, query the server using `curl`. Depending on the model configuration, this query might take a few minutes to respond.
 
 > [!NOTE]
@@ -244,8 +246,6 @@ Setting `stream` to "true" in the request emits a response in chunks. If `stream
 
 **Example Input + Output**
 
-[skip default]: begin
-
 ```
 curl http://127.0.0.1:5000/v1/chat/completions \
   -H "Content-Type: application/json" \
   -d '{
@@ -265,12 +265,14 @@ curl http://127.0.0.1:5000/v1/chat/completions \
   ]
 }'
 ```
+[skip default]: begin
 
 ```
 {"response":" I'm a software developer with a passion for building innovative and user-friendly applications. I have experience in developing web and mobile applications using various technologies such as Java, Python, and JavaScript. I'm always looking for new challenges and opportunities to learn and grow as a developer.\n\nIn my free time, I enjoy reading books on computer science and programming, as well as experimenting with new technologies and techniques. I'm also interested in machine learning and artificial intelligence, and I'm always looking for ways to apply these concepts to real-world problems.\n\nI'm excited to be a part of the developer community and to have the opportunity to share my knowledge and experience with others. I'm always happy to help with any questions or problems you may have, and I'm looking forward to learning from you as well.\n\nThank you for visiting my profile! I hope you find my information helpful and interesting. If you have any questions or would like to discuss any topics, please feel free to reach out to me. I"}
 ```
 
 [skip default]: end
+[shell default]: kill ${server_pid}
 
@@ -664,6 +666,6 @@ awesome libraries and tools you've built around local LLM inference.
 
 torchchat is released under the [BSD 3 license](LICENSE). (Additional code in this
 distribution is covered by the MIT and Apache Open Source
-licenses.) However, you may have other legal obligations that govern
+licenses.) However, you may have other legal obligations that govern
 your use of content, such as the terms of service for third-party models.
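
The `[shell default]` markers added to the README rely on a standard shell pattern: launch the server in the background, capture its process ID via `$!`, and kill it once the documented queries have run. A minimal sketch of that pattern, using `sleep` as a stand-in for the `torchchat.py server` process:

```shell
# Launch a long-running command in the background and record its PID.
# ("sleep 30" stands in for "python3 torchchat.py server llama3.1".)
sleep 30 &
server_pid=$!

# ... issue curl requests against the server here ...

# Shut the background process down once the run is finished.
kill ${server_pid}
```

Capturing `$!` immediately after the `&` matters: it holds the PID of the most recently backgrounded job, so any intervening background command would overwrite it.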