 msgstr ""
 "Project-Id-Version: Xinference \n"
 "Report-Msgid-Bugs-To: \n"
-"POT-Creation-Date: 2025-08-04 12:08+0800\n"
+"POT-Creation-Date: 2025-10-21 13:27+0800\n"
 "PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
 "Last-Translator: FULL NAME <EMAIL@ADDRESS>\n"
 "Language: zh_CN\n"
@@ -376,11 +376,21 @@ msgstr ""
 "向量生成:`https://platform.openai.com/docs/api-reference/embeddings <"
 "https://platform.openai.com/docs/api-reference/embeddings>`_"
 
-#: ../../source/getting_started/using_xinference.rst:330
+#: ../../source/getting_started/using_xinference.rst:329
+msgid ""
+"Xinference also supports Anthropic API via base url "
+"``http://127.0.0.1:9997/anthropic``, you can use Xinference in Claude "
+"Code and so forth. Refer to :ref:`anthropic client <anthropic_client>` "
+"for more details."
+msgstr ""
+"Xinference 还支持通过基础 URL ``http://127.0.0.1:9997/anthropic`` 调用 Anthropic API,"
+"你可以在 Claude Code 等环境中使用 Xinference。更多详情请参阅 :ref:`anthropic client <anthropic_client>`。"
+
+#: ../../source/getting_started/using_xinference.rst:333
 msgid "Manage Models"
 msgstr "管理模型"
 
-#: ../../source/getting_started/using_xinference.rst:332
+#: ../../source/getting_started/using_xinference.rst:335
 msgid ""
 "In addition to launching models, Xinference offers various ways to manage"
 " the entire lifecycle of models. You can manage models in Xinference "
@@ -389,37 +399,37 @@ msgstr ""
 "除了启动模型,Xinference 提供了管理模型整个生命周期的能力。同样的,你可以"
 "使用命令行、cURL 以及 Python 代码来管理:"
 
-#: ../../source/getting_started/using_xinference.rst:335
+#: ../../source/getting_started/using_xinference.rst:338
 msgid ""
 "You can list all models of a certain type that are available to launch in"
 " Xinference:"
 msgstr "可以列出所有 Xinference 支持的指定类型的模型:"
 
-#: ../../source/getting_started/using_xinference.rst:353
+#: ../../source/getting_started/using_xinference.rst:356
 msgid ""
 "The following command gives you the currently running models in "
 "Xinference:"
 msgstr "接下来的命令可以列出所有在运行的模型:"
 
-#: ../../source/getting_started/using_xinference.rst:371
+#: ../../source/getting_started/using_xinference.rst:374
 msgid ""
 "When you no longer need a model that is currently running, you can remove"
 " it in the following way to free up the resources it occupies:"
 msgstr "当你不需要某个正在运行的模型,可以通过以下的方式来停止它并释放资源:"
 
-#: ../../source/getting_started/using_xinference.rst:392
+#: ../../source/getting_started/using_xinference.rst:395
 msgid "Deploy Xinference In a Cluster"
 msgstr "集群中部署 Xinference"
 
-#: ../../source/getting_started/using_xinference.rst:394
+#: ../../source/getting_started/using_xinference.rst:397
 msgid ""
 "To deploy Xinference in a cluster, you need to start a Xinference "
 "supervisor on one server and Xinference workers on the other servers."
 msgstr ""
 "若要在集群环境中部署 Xinference,需要在一台机器中启动 supervisor 节点,并"
 "在当前或者其他节点启动 worker 节点。"
 
-#: ../../source/getting_started/using_xinference.rst:397
+#: ../../source/getting_started/using_xinference.rst:400
 msgid ""
 "First, make sure you have already installed Xinference on each of the "
 "servers according to the instructions provided :ref:`here "
@@ -428,23 +438,23 @@ msgstr ""
 "首先,根据 :ref:`文档 <installation>` 确保所有的服务器上都安装了 "
 "Xinference。接下来按照步骤:"
 
-#: ../../source/getting_started/using_xinference.rst:401
+#: ../../source/getting_started/using_xinference.rst:404
 msgid "Start the Supervisor"
 msgstr "启动 Supervisor"
 
-#: ../../source/getting_started/using_xinference.rst:402
+#: ../../source/getting_started/using_xinference.rst:405
 msgid ""
 "On the server where you want to run the Xinference supervisor, run the "
 "following command:"
 msgstr "在服务器上执行以下命令来启动 Supervisor 节点:"
 
-#: ../../source/getting_started/using_xinference.rst:408
+#: ../../source/getting_started/using_xinference.rst:411
 msgid ""
 "Replace ``${supervisor_host}`` with the actual host of your supervisor "
 "server."
 msgstr "用当前节点的 IP 来替换 ``${supervisor_host}``。"
 
-#: ../../source/getting_started/using_xinference.rst:411
+#: ../../source/getting_started/using_xinference.rst:414
 msgid ""
 "You can the supervisor's web UI at `http://${supervisor_host}:9997/ui "
 "<http://${supervisor_host}:9997/ui>`_ and visit "
@@ -455,23 +465,23 @@ msgstr ""
 "/ui>`_ 访问 web UI,在 `http://${supervisor_host}:9997/docs <http://${"
 "supervisor_host}:9997/docs>`_ 访问 API 文档。"
 
-#: ../../source/getting_started/using_xinference.rst:415
+#: ../../source/getting_started/using_xinference.rst:418
 msgid "Start the Workers"
 msgstr "启动 Worker"
 
-#: ../../source/getting_started/using_xinference.rst:417
+#: ../../source/getting_started/using_xinference.rst:420
 msgid ""
 "On each of the other servers where you want to run Xinference workers, "
 "run the following command:"
 msgstr "在需要启动 Xinference worker 的机器上执行以下命令:"
 
-#: ../../source/getting_started/using_xinference.rst:424
+#: ../../source/getting_started/using_xinference.rst:427
 msgid ""
 "Note that you must replace ``${worker_host}`` with the actual host of "
 "your worker server."
 msgstr "需要注意的是,必须使用当前 Worker 节点的 IP 来替换 ``${worker_host}``。"
 
-#: ../../source/getting_started/using_xinference.rst:427
+#: ../../source/getting_started/using_xinference.rst:430
 msgid ""
 "Note that if you need to interact with the Xinference in a cluster via "
 "the command line, you should include the ``-e`` or ``--endpoint`` flag to"
@@ -480,56 +490,55 @@ msgstr ""
 "需要注意的是,如果你需要通过命令行与集群交互,应该通过 ``-e`` 或者 ``--"
 "endpoint`` 参数来指定 supervisor 的地址,比如:"
 
-#: ../../source/getting_started/using_xinference.rst:435
+#: ../../source/getting_started/using_xinference.rst:438
 msgid "Using Xinference With Docker"
 msgstr "使用 Docker 部署 Xinference"
 
-#: ../../source/getting_started/using_xinference.rst:437
+#: ../../source/getting_started/using_xinference.rst:440
 msgid "To start Xinference in a Docker container, run the following command:"
 msgstr "用以下命令在容器中运行 Xinference:"
 
-#: ../../source/getting_started/using_xinference.rst:440
+#: ../../source/getting_started/using_xinference.rst:443
 msgid "Run On Nvidia GPU Host"
 msgstr "在拥有英伟达显卡的机器上运行"
 
-#: ../../source/getting_started/using_xinference.rst:442
+#: ../../source/getting_started/using_xinference.rst:445
 msgid "For cuda 12.4:"
 msgstr "对于 cuda 12.4:"
 
-#: ../../source/getting_started/using_xinference.rst:448
+#: ../../source/getting_started/using_xinference.rst:451
 msgid "For cuda 12.8:"
 msgstr "对于 cuda 12.8:"
 
-#: ../../source/getting_started/using_xinference.rst:450
+#: ../../source/getting_started/using_xinference.rst:453
 msgid ""
 "CUDA 12.8 version is experimental, welcome to give us feedbacks to help "
 "us to improve."
-msgstr ""
-"CUDA 12.8 版本是实验性质,欢迎给我们反馈以改进。"
+msgstr "CUDA 12.8 版本是实验性质,欢迎给我们反馈以改进。"
 
-#: ../../source/getting_started/using_xinference.rst:458
+#: ../../source/getting_started/using_xinference.rst:461
 msgid "Run On CPU Only Host"
 msgstr "在只有 CPU 的机器上运行"
 
-#: ../../source/getting_started/using_xinference.rst:464
+#: ../../source/getting_started/using_xinference.rst:467
 msgid ""
 "Replace ``<your_version>`` with Xinference versions, e.g. ``v0.10.3``, "
 "``latest`` can be used for the latest version."
 msgstr ""
 "将 ``<your_version>`` 替换为 Xinference 的版本,比如 ``v0.10.3``,可以用 "
 "``latest`` 来获取最新版本。"
 
-#: ../../source/getting_started/using_xinference.rst:466
+#: ../../source/getting_started/using_xinference.rst:469
 msgid ""
 "For more docker usage, refer to :ref:`Using Docker Image "
 "<using_docker_image>`."
 msgstr "更多 docker 使用,请参考 :ref:`使用 docker 镜像 <using_docker_image>`。"
 
-#: ../../source/getting_started/using_xinference.rst:470
+#: ../../source/getting_started/using_xinference.rst:473
 msgid "What's Next?"
 msgstr "更多"
 
-#: ../../source/getting_started/using_xinference.rst:472
+#: ../../source/getting_started/using_xinference.rst:475
 msgid ""
 "Congratulations on getting started with Xinference! To help you navigate "
 "and make the most out of this powerful tool, here are some resources and "
@@ -538,12 +547,12 @@ msgstr ""
 "恭喜你,已经初步掌握了 Xinference 的用法!为了帮助你更好地使用工具,下面"
 "是其他的一些文档和指导资源:"
 
-#: ../../source/getting_started/using_xinference.rst:475
+#: ../../source/getting_started/using_xinference.rst:478
 msgid ""
 ":ref:`How to Use Client APIs for Different Types of Models "
 "<user_guide_client_api>`"
 msgstr ":ref:`如何使用 Python 创建不同类型的模型 <user_guide_client_api>`"
 
-#: ../../source/getting_started/using_xinference.rst:477
+#: ../../source/getting_started/using_xinference.rst:480
 msgid ":ref:`Choosing the Right Backends for Your Needs <user_guide_backends>`"
 msgstr ":ref:`选择正确的推理引擎 <user_guide_backends>`"