docs/source/reference/llms.rst
LLMWrapperBase
TransformersWrapper
vLLMWrapper
RemoteTransformersWrapper
RemotevLLMWrapper
ChatHistory
Text
LogProbs
Masks
Tokens

Remote Wrappers
^^^^^^^^^^^^^^^
TorchRL provides remote wrapper classes that enable distributed execution of LLM wrappers using Ray. These wrappers provide a simplified interface that doesn't require explicit ``remote()`` and ``get()`` calls, making them easy to use in distributed settings.
**Key Features:**

- **Simplified Interface**: No need to call ``remote()`` and ``get()`` explicitly
- **Full API Compatibility**: Exposes all public methods from the base ``LLMWrapperBase`` class
- **Automatic Ray Management**: Handles Ray initialization and remote execution internally
- **Property Access**: All properties are accessible through the remote wrapper
- **Error Handling**: Proper error propagation from remote actors
- **Resource Management**: Context manager support for automatic cleanup
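The "simplified interface" above can be pictured as a thin proxy that forwards attribute access to a remote actor and resolves the result before returning, so callers never touch ``remote()``/``get()`` themselves. The sketch below is illustrative only: ``FakeActor`` and ``RemoteProxy`` are hypothetical plain-Python stand-ins for a Ray actor handle and for TorchRL's wrapper, not the actual implementation.

```python
class FakeActor:
    """Hypothetical stand-in for a Ray actor handle."""

    def __init__(self, model_name):
        self.model_name = model_name

    def generate(self, prompt):
        # A real Ray actor method would return an ObjectRef (a future) here.
        return f"{self.model_name}: {prompt}"


class RemoteProxy:
    """Hypothetical sketch of the proxy pattern: forwards method calls to
    the actor and resolves the result, hiding `remote()`/`get()`."""

    def __init__(self, actor):
        self._actor = actor

    def __getattr__(self, name):
        method = getattr(self._actor, name)

        def call(*args, **kwargs):
            future = method(*args, **kwargs)  # actor.method.remote(...) in real Ray
            return future                     # ray.get(future) in real Ray

        return call


proxy = RemoteProxy(FakeActor("demo-model"))
print(proxy.generate("hello"))  # called like a local method
```

The same idea explains the "Full API Compatibility" bullet: because forwarding happens generically per attribute, every public method of the wrapped class is reachable through the proxy.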
**Model Parameter Requirements:**

- **RemotevLLMWrapper**: Accepts string model names/paths (recommended) or remote vLLM LLM objects with Ray handles. Local vLLM models are not serializable.
- **RemoteTransformersWrapper**: Only accepts string model names/paths. Transformers models are not serializable.
**Usage Examples:**
.. code-block:: python

    import ray
    from torchrl.modules.llm.policies import RemotevLLMWrapper, RemoteTransformersWrapper
    from torchrl.data.llm import History
    from torchrl.modules.llm.policies import ChatHistory, Text
    from tensordict import TensorDict

    # Initialize Ray (if not already done)
    if not ray.is_initialized():
        ray.init()

    # Use context manager for proper cleanup (recommended)