Commit 0e0f758

update Open Source Docs from Roblox internal teams
1 parent e7bd8c8 commit 0e0f758

File tree

3 files changed: +91 −0 lines changed


content/common/navigation/engine/guides.yaml

Lines changed: 2 additions & 0 deletions

@@ -307,6 +307,8 @@ navigation:
       path: /characters/name-health-display
     - title: R6 to R15 adapter
       path: /characters/r6-to-r15-adapter
+    - title: Generate text
+      path: /characters/generate-text
     - title: Players
       path: /players/
       section:
Lines changed: 88 additions & 0 deletions

@@ -0,0 +1,88 @@
---
title: Generate text
description: Use an LLM to generate text for NPC dialog, help systems, and more.
---

<Alert severity="warning">
The text generation API is currently in beta. To get started with the API, fill out [this form](https://survey.roblox.com/jfe/form/SV_3QnTMZaFmYe2FTM). You must be ID verified, and your experience must have a **Moderate** or **Restricted** content maturity rating.
</Alert>

The text generation API lets you use a large language model (LLM) to generate text based on a system prompt from you and a user prompt from the player. The most common use of the API is for creating interactive non-player characters (NPCs).

For example, in a survival experience, your system prompt for a talking animal might be `"You are a very busy beaver. You end all statements by mentioning how you need to get back to work on your dam."`. Players could ask the beaver about water in the area, the size of a nearby forest, predators, etc.

The novelty of LLM responses can help create unique, delightful moments for players, but using the API effectively requires a bit of creativity and tuning. System prompts can be very extensive, so don't hesitate to include a long string with lots of detail.
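
For instance, a fuller version of the beaver prompt might be assembled as one long string. The following snippet is only a sketch; the variable name and wording are illustrative, not part of the API.

```lua
-- Illustrative only: a more detailed system prompt for the talking beaver NPC
local beaverSystemPrompt = table.concat({
	"You are a very busy beaver who lives by the river in a survival forest.",
	"You know where to find fresh water, roughly how large the nearby forest is, and which predators roam at night.",
	"Keep answers to a few sentences and always stay in character.",
	"You end all statements by mentioning how you need to get back to work on your dam.",
}, " ")
```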

## Generate text

Generating text from a user prompt generally requires at least two scripts: a server script to make the HTTP request and a client script to get user input and display the generated text. The following scripts are a minimal example of this scenario; they use a hard-coded user prompt and show the generated text in an NPC chat bubble. A sketch of capturing real player input follows the two scripts.

For a more full-featured example, see the [demo experience](https://www.roblox.com/games/90182386287074/DemoAI). Click the **&ctdot;** button, then **Edit in Studio**.

- The text generation API is an [Open Cloud API](../cloud/index.md), meaning that the request requires a [path](../cloud/reference/patterns.md), formed from your universe ID. You can find your universe ID in the overflow menu of the experience tile on the [Creator Hub](https://create.roblox.com/dashboard/creations).
- You must include the [Open Cloud client package](../production/promotion/experience-notifications.md#include-the-package) in your experience; the server script requires it.
- The text generation API currently only supports RCC authentication. As a result, you must use [Team Test](../studio/home-tab.md#team-test) to test the API within your experience.
```lua title="Client script"
27+
local ReplicatedStorage = game:GetService("ReplicatedStorage")
28+
local TextChatService = game:GetService("TextChatService")
29+
local ChatEvent = ReplicatedStorage:WaitForChild("ChatEvent")
30+
31+
ChatEvent.OnClientEvent:Connect(function(part, message)
32+
TextChatService:DisplayBubble(part, message)
33+
-- Optionally print for debug purposes
34+
print("LLM output: " .. message)
35+
end)
36+
```

```lua title="Server script in ServerScriptService"
-- Assumes the Open Cloud dev module is in ReplicatedStorage
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local oc = require(ReplicatedStorage.OpenCloud.V2)

local chatEvent = Instance.new("RemoteEvent")
chatEvent.Name = "ChatEvent"
chatEvent.Parent = ReplicatedStorage

-- Form the HTTP request
local requestLLM : oc.GenerateTextRequest = {
	path = oc:UniversePath("<your_universe_id>"),
	user_prompt = "Tell me about Roblox in under 200 characters.",
	system_prompt = "You're extremely polite.",
	context_token = "",
	max_tokens = 100,
	model = "default"
}

local resultLLM : oc.Result<oc.GenerateTextResponse> = oc:GenerateText(requestLLM)

local npc = workspace:WaitForChild("NPCDog") -- Replace with your NPC's name
local head = npc:WaitForChild("Head") -- Ensure your NPC has a Head part

-- Fire the event to display the chat bubble
chatEvent:FireAllClients(head, resultLLM.Response.generated_text)
```
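
The example above hard-codes the user prompt. As a sketch of capturing real player input instead, the client below forwards the local player's chat messages to the server, which uses each message as the `user_prompt`. The `PromptEvent` RemoteEvent and the NPC filtering decisions are illustrative assumptions, not part of the API; a real experience would likely only forward messages aimed at the NPC.

```lua title="Client script (input sketch)"
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local TextChatService = game:GetService("TextChatService")

local PromptEvent = ReplicatedStorage:WaitForChild("PromptEvent")

-- Forward every message the local player sends; filter as needed for your NPC
TextChatService.SendingMessage:Connect(function(textChatMessage)
	PromptEvent:FireServer(textChatMessage.Text)
end)
```

```lua title="Server script (input sketch)"
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local oc = require(ReplicatedStorage.OpenCloud.V2)

local promptEvent = Instance.new("RemoteEvent")
promptEvent.Name = "PromptEvent"
promptEvent.Parent = ReplicatedStorage

-- Reuses the ChatEvent and NPC from the minimal example above
local chatEvent = ReplicatedStorage:WaitForChild("ChatEvent")
local head = workspace:WaitForChild("NPCDog"):WaitForChild("Head")

promptEvent.OnServerEvent:Connect(function(player, text)
	local request : oc.GenerateTextRequest = {
		path = oc:UniversePath("<your_universe_id>"),
		user_prompt = text,
		system_prompt = "You are a very busy beaver.",
		context_token = "",
		max_tokens = 100,
		model = "default"
	}
	local result : oc.Result<oc.GenerateTextResponse> = oc:GenerateText(request)
	chatEvent:FireAllClients(head, result.Response.generated_text)
end)
```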

## Text generation API reference

### Request parameters

| Parameter Name | Data Type | Description | Required |
| --- | --- | --- | --- |
| path | string | The path of the universe. Format: `universes/{universe_id}` | Yes |
| user_prompt | string | The prompt from the user that initiates the chat. This could be a question, statement, or command that the user wants the model to respond to. | No |
| system_prompt | string | The system prompt provides context to the model about its role, tone, or behavior during the conversation. This parameter can guide the model on how to respond, setting expectations like `"You are an assistant"` or `"Use a formal tone"`. | No |
| temperature | number | Controls the "creativity" or randomness of the model's responses. Values closer to 1 increase randomness, while values closer to 0 make the responses more focused and deterministic. Default: 0.8 | No |
| top_p | number | Helps the AI model narrow or expand the range of possible words to sample from while generating the next token. This setting narrows the token choices to only contain words that together make up a certain percentage of total likelihood (for example, 90%). A lower `top_p` means the AI sticks to closer, more predictable choices, while a higher `top_p` opens the door to more diverse and creative responses. Default: 0.4 | No |
| max_tokens | number | The maximum number of tokens in the response generated by the model. This limits the length of the response, preventing overly long or incomplete answers. Default: 1000 | No |
| seed | number | Sets a fixed seed for the random number generator, allowing reproducible responses when the same input parameters are used across multiple requests. By setting the same seed value, you can obtain identical results for debugging, testing, or evaluation purposes (see the example after this table). | No |
| context_token | string | Prompt history context token. The context token contains a summarization of the previous prompt requests and responses in a conversation up to the current request. If no token is provided, a new token is generated and returned in the response. Providing a previously generated context token restores the conversation state into the current API request. | No |
| model | string | The model and version to use to generate the response. Can be used to override the default model used for text generation. Currently, only the default model is available. | No |
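
As a rough sketch of how the optional tuning parameters combine, the request below sets `temperature`, `top_p`, `seed`, and `max_tokens` alongside the required path. It assumes the same `oc` module as the server script earlier; the prompt text and values are arbitrary examples, not recommendations.

```lua
-- Illustrative request only; parameter values are arbitrary examples
local tunedRequest : oc.GenerateTextRequest = {
	path = oc:UniversePath("<your_universe_id>"),
	user_prompt = "Describe the weather near your dam in one sentence.",
	system_prompt = "You are a very busy beaver.",
	temperature = 0.5, -- lower values make responses more deterministic
	top_p = 0.4, -- keep token choices relatively predictable
	seed = 42, -- fixed seed for reproducible output while testing
	max_tokens = 150, -- cap the response length
	model = "default"
}

local tunedResult : oc.Result<oc.GenerateTextResponse> = oc:GenerateText(tunedRequest)
print(tunedResult.Response.generated_text)
```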

### Response fields

| Field Name | Data Type | Description |
| --- | --- | --- |
| generated_text | string | The generated response. |
| context_token | string | A token containing the summarization of a previously passed context token and the current generated response. This token can be passed into subsequent requests to maintain the state of the current conversation. Subsequent requests generate new tokens with updated conversation state. Extracting the token and providing it in the next request maintains the ongoing conversation context (see the example after this table). |
| model | string | The model and version that generated the response. |
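
To show how `context_token` carries conversation state between calls, the sketch below makes two requests and passes the token from the first response into the second. It assumes the same `oc` module as the examples above; the prompts are illustrative.

```lua
-- First turn: no prior context, so pass an empty context token
local firstRequest : oc.GenerateTextRequest = {
	path = oc:UniversePath("<your_universe_id>"),
	user_prompt = "Where can I find fresh water?",
	system_prompt = "You are a very busy beaver.",
	context_token = "",
	max_tokens = 100,
	model = "default"
}
local firstResult : oc.Result<oc.GenerateTextResponse> = oc:GenerateText(firstRequest)

-- Second turn: reuse the returned token so the model remembers the first exchange
local secondRequest : oc.GenerateTextRequest = {
	path = oc:UniversePath("<your_universe_id>"),
	user_prompt = "How far is that from your dam?",
	system_prompt = "You are a very busy beaver.",
	context_token = firstResult.Response.context_token,
	max_tokens = 100,
	model = "default"
}
local secondResult : oc.Result<oc.GenerateTextResponse> = oc:GenerateText(secondRequest)
print(secondResult.Response.generated_text)
```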

tools/checks/utils/allowedHttpLinks.txt

Lines changed: 1 addition & 0 deletions

@@ -610,3 +610,4 @@ https://developer.mozilla.org/docs/Web/HTTP/Methods/TRACE
 https://developer.mozilla.org/docs/Web/HTTP/Methods/PATCH
 https://docs.google.com/forms/d/e/1FAIpQLSfGTRQwATB2wUg0P4HUSTtyXrhptFahJifo1ew84SyqtfSBfg/viewform
 https://www.youtube.com/playlist?list=PLuEQ5BB-Z1PJi8RJ7Kuc0JhcT0ubgaL43
+https://survey.roblox.com/jfe/form/SV_3QnTMZaFmYe2FTM
