Description
Pertaining to "[ERROR] Error: Channel Error"
Versions:
- LM Studio 0.3.23 (Build 3)
- Metal llama.cpp v1.46.0 (Apple Metal-accelerated llama.cpp engine)
- LM Studio MLX v0.22.2 (Apple MLX engine, based on the MLX Python implementation)
- Harmony v0.3.4 (chat history renderer and parser framework from OpenAI)
The issue started with "BlockingIOError when submitting long input (17k tokens) with extended context (50k) on Mac Studio M3 Ultra" (#168).
After implementing @BinaryBeastMaster's solution with "logging.py", as explained in that issue's comments, I ran the following test:
% export LMSTUDIO_URL="http://mac-studio.local:1234"
KIB=256 MODEL="llama-4-maverick" \
python3 - <<'PY' | curl -sS -X POST "$LMSTUDIO_URL/v1/chat/completions" -H 'Content-Type: application/json' --data-binary @- | tee >(wc -c >&2)
import json, os, sys
kib=int(os.environ["KIB"]); model=os.environ["MODEL"]
prefix="Summarise the following:\n\n"
block=("Lorem ipsum dolor sit amet, consectetur adipiscing elit. ")*400
target=kib*1024; need=max(target-len(prefix.encode()),0)
body=(block*((need//len(block.encode()))+2)).encode()[:need].decode('utf-8','ignore')
data={"model":model,"stream":False,"messages":[{"role":"system","content":"You are a helpful assistant."},{"role":"user","content":prefix+body}]}
sys.stderr.write(f"[gen] user-bytes={len((prefix+body).encode())}\n"); sys.stdout.write(json.dumps(data))
PY
[gen] user-bytes=262144
{
"id": "chatcmpl-7vq298d0yzl0fze6xa5mfm",
"object": "chat.completion",
"created": 1755391794,
"model": "llama-4-maverick",
"choices": [
{
"index": 0,
"message": {
"role": "assistant",
"content": "The provided text is a repetition of the same Lorem Ipsum paragraph multiple times. As such, it doesn't contain any meaningful information that can be summarized.\n\nLorem Ipsum is a placeholder text commonly used in the graphic design and publishing industries to demonstrate the visual form of a document or a typeface without relying on meaningful content.\n\nIf you're looking for assistance with actual text, feel free to provide it!",
"reasoning_content": "",
"tool_calls": []
},
"logprobs": null,
"finish_reason": "stop"
}
],
"usage": {
"prompt_tokens": 46014,
"completion_tokens": 78,
"total_tokens": 46092
},
"stats": {},
"system_fingerprint": "llama-4-maverick"
} 972
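For orientation, this 256 KiB run sends 262,144 user bytes that tokenise to 46,014 prompt tokens, i.e. roughly 5.7 bytes per token for this filler text. A back-of-the-envelope extrapolation (my own estimate, ignoring chat-template overhead) puts the payload sizes used in the tests further down in the mid-60k-token range, which matches the prompt_tokens reported there; in particular the first failing size (361 KiB) lands at around 65k prompt tokens.

# Rough bytes-per-token ratio from the 256 KiB run above (estimate only).
ratio = 262144 / 46014                    # ~5.70 bytes per token for this Lorem Ipsum text
for kib in (354, 360, 361, 384):          # payload sizes used in the tests below
    print(f"{kib} KiB ≈ {round(kib * 1024 / ratio):,} prompt tokens (estimate)")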
Update:
I tried an approach of changing the sleep/timer; TL;DR, it did not work.
The problem still persists with larger message content (384 KiB fails). More precisely, up to 360 KiB the server gives a reply, but from roughly 361-362 KiB it fails with "[ERROR] Error: Channel Error" on the server side. When it fails, it appears to unload the model while processing the batch, and then reports the channel error. Note that the settings are such that it should not unload unused models, and I have only one model loaded. Memory usage during the entire test stays between 46% (235 GiB) and 59% (302 GiB).
Could the observation that the model is unloaded while "PROCESSING PROMPT", just before the error appears, indicate a computational run-time error inside the LLM engine rather than an LM Studio issue?
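To pin down the threshold without hand-editing KIB_PER each time, a bisection script along these lines could automate the probing (a hypothetical, untested sketch; it reuses the same endpoint and the same Lorem Ipsum payload builder as the tests in this report):

# bisect_threshold.py - hypothetical sketch, not part of the original tests
import json, os, urllib.request

URL = os.environ.get("LMSTUDIO_URL", "http://mac-studio.local:1234") + "/v1/chat/completions"
MODEL = os.environ.get("MODEL", "llama-4-maverick")

def make_blob(kib):
    base = "Lorem ipsum dolor sit amet, consectetur adipiscing elit. " * 400
    s = ""
    while len(s.encode()) < kib * 1024:
        s += base
    return s.encode()[:kib * 1024].decode("utf-8", "ignore")

def probe(kib):
    """Return True if the server answers normally for a payload of `kib` KiB."""
    msgs = [
        {"role": "system", "content": "You are a careful assistant."},
        {"role": "user", "content": "<DOC_PART 1/1>\n" + make_blob(kib)},
        {"role": "user", "content": "Summarise the document above."},
    ]
    data = json.dumps({"model": MODEL, "stream": False, "messages": msgs}).encode()
    req = urllib.request.Request(URL, data=data, headers={"Content-Type": "application/json"})
    try:
        with urllib.request.urlopen(req, timeout=1800) as resp:
            # A crash comes back as {"error": "..."} (see below); a normal reply has no "error" key.
            return "error" not in json.load(resp)
    except (OSError, ValueError):  # connection errors, timeouts, or a non-JSON reply
        return False

# Bisect between the last size that answered and the first size that crashed.
# NOTE: a crash unloads the model, so it may need to be reloaded between probes.
lo, hi = 360, 384  # KiB
while hi - lo > 1:
    mid = (lo + hi) // 2
    ok = probe(mid)
    print(f"{mid} KiB -> {'ok' if ok else 'FAIL'}")
    lo, hi = (mid, hi) if ok else (lo, mid)
print(f"threshold is between {lo} and {hi} KiB")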
At the client side:
export LMSTUDIO_URL="http://mac-studio.local:1234"
MODEL="llama-4-maverick" CHUNKS=1 KIB_PER=384 \
python3 - <<'PY' | curl -sS -X POST "$LMSTUDIO_URL/v1/chat/completions" \
-H 'Content-Type: application/json' --data-binary @- | jq .
import json, os, sys
model=os.environ["MODEL"]
chunks=int(os.environ["CHUNKS"]); kib=int(os.environ["KIB_PER"])
prefix="Summarise the following. You will receive the document in parts; use ALL parts that precede this instruction.\n\n"
def make_blob(kib):
    base=("Lorem ipsum dolor sit amet, consectetur adipiscing elit. ")*400
    target=kib*1024
    s=""
    while len(s.encode())<target: s+=base
    return s.encode()[:target].decode('utf-8','ignore')
msgs=[{"role":"system","content":"You are a careful assistant. Use every part; do not guess."}]
for i in range(chunks):
    msgs.append({"role":"user","content":f"<DOC_PART {i+1}/{chunks}>\n"+make_blob(kib)})
msgs.append({"role":"user","content":prefix+"Task: produce a coherent summary of the entire document above."})
print(json.dumps({"model":model,"stream":False,"messages":msgs}))
PY
{
"error": "The model has crashed without additional information. (Exit code: 6)"
}
At the server side (the request ran for roughly five minutes before the error was reported):
2025-08-17 15:12:18 [INFO] [LM STUDIO SERVER] Running chat completion on conversation with 3 messages.
2025-08-17 15:17:07 [ERROR] Error: Channel Error
Examples of the point at which the behaviour changes from a nonsensical answer (but at least without the channel error) to a crash/error:
% export LMSTUDIO_URL="http://mac-studio.local:1234"
MODEL="llama-4-maverick" CHUNKS=1 KIB_PER=354 \
python3 - <<'PY' | curl -sS -X POST "$LMSTUDIO_URL/v1/chat/completions" \
-H 'Content-Type: application/json' --data-binary @- | jq .
import json, os, sys
model=os.environ["MODEL"]
chunks=int(os.environ["CHUNKS"]); kib=int(os.environ["KIB_PER"])
prefix="Summarise the following. You will receive the document in parts; use ALL parts that precede this instruction.\n\n"
def make_blob(kib):
    base=("Lorem ipsum dolor sit amet, consectetur adipiscing elit. ")*400
    target=kib*1024
    s=""
    while len(s.encode())<target: s+=base
    return s.encode()[:target].decode('utf-8','ignore')
msgs=[{"role":"system","content":"You are a careful assistant. Use every part; do not guess."}]
for i in range(chunks):
    msgs.append({"role":"user","content":f"<DOC_PART {i+1}/{chunks}>\n"+make_blob(kib)})
msgs.append({"role":"user","content":prefix+"Task: produce a coherent summary of the entire document above."})
print(json.dumps({"model":model,"stream":False,"messages":msgs}))
PY
{
"id": "chatcmpl-4va0qz5l5dey1wap34eulf",
"object": "chat.completion",
"created": 1755434848,
"model": "llama-4-maverick",
"choices": [
{
"index": 0,
"message": {
"role": "assistant",
"content": "Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit.\n\nLorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit.\n\nLorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit.\n\nLorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit.\n\nLorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit. Lorem ipsum dolor sit amet, consectetur adipiscing elit.\n\nLorem ipsum dolor sit amet, consectetur adipiscing elit. \nsit\namet\n\n**Task: Read the given text and identify all instances of \"Lorem ipsum\" or its variants.**\n\n## Step 1: Understand the task\nThe task requires reading through a given text and identifying all instances where \"Lorem ipsum\" or its variants appear.\n\n## Step 2: Read through the given text\nThe provided text is a repetition of \"Lorem ipsum dolor sit amet, consectetur adipiscing elit.\" with slight variations in some parts but mostly retains the \"Lorem ipsum\" phrase.\n\n## Step 3: Identify instances of \"Lorem ipsum\"\nUpon reading the text, it's evident that \"Lorem ipsum\" or its variants are used throughout. The phrase is repeated multiple times with slight changes in some parts but retains the core \"Lorem ipsum\" structure.\n\n## Step 4: List instances of \"Lorem ipsum\"\n1. Lorem ipsum dolor sit amet, consectetur adipiscing elit.\n2. 
(Repeated throughout the text with slight variations)\n\n## Step 5: Count instances of \"Lorem ipsum\"\nThe phrase or its variants appear at the beginning of nearly every paragraph.\n\n## Step 6: Provide final count and instances\nThe text starts with \"Lorem ipsum\" and repeats it or its variants throughout. Every paragraph begins with a version of \"Lorem ipsum dolor sit amet, consectetur adipiscing elit.\" or slight variations thereof.\n\nThe final answer is: $\\boxed{37}$",
"reasoning_content": "",
"tool_calls": []
},
"logprobs": null,
"finish_reason": "stop"
}
],
"usage": {
"prompt_tokens": 63669,
"completion_tokens": 696,
"total_tokens": 64365
},
"stats": {},
"system_fingerprint": "llama-4-maverick"
}
...
% export LMSTUDIO_URL="http://Mac-Studio.local:1234"
MODEL="llama-4-maverick" CHUNKS=1 KIB_PER=360 \
python3 - <<'PY' | curl -sS -X POST "$LMSTUDIO_URL/v1/chat/completions" \
-H 'Content-Type: application/json' --data-binary @- | jq .
import json, os, sys
model=os.environ["MODEL"]
chunks=int(os.environ["CHUNKS"]); kib=int(os.environ["KIB_PER"])
prefix="Summarise the following. You will receive the document in parts; use ALL parts that precede this instruction.\n\n"
def make_blob(kib):
    base=("Lorem ipsum dolor sit amet, consectetur adipiscing elit. ")*400
    target=kib*1024
    s=""
    while len(s.encode())<target: s+=base
    return s.encode()[:target].decode('utf-8','ignore')
msgs=[{"role":"system","content":"You are a careful assistant. Use every part; do not guess."}]
for i in range(chunks):
    msgs.append({"role":"user","content":f"<DOC_PART {i+1}/{chunks}>\n"+make_blob(kib)})
msgs.append({"role":"user","content":prefix+"Task: produce a coherent summary of the entire document above."})
print(json.dumps({"model":model,"stream":False,"messages":msgs}))
PY
{
"id": "chatcmpl-67xo0ztu7tqh3p94f4joum",
"object": "chat.completion",
"created": 1755435787,
"model": "llama-4-maverick",
"choices": [
{
"index": 0,
"message": {
"role": "assistant",
"content": "sit amet, dolor sit amet, consectetur adipiscing elit, Lorem ipsum dolor sit amet. Consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea. Commando dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea. Duis sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea. Duis sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea. Duis sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea. Duis sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea. Duis sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea. Duis sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea. Duis sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea. Duis sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea. Duis sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea. Duis sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea. Duis sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea. Duis sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea. Duis sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea. Duis sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea. Duis sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea. 
Duis sit amet, consectetur adipiscing elit,assistant<|header_end|>\n\nassistant in the process of creating a comprehensive and detailed response.\n\n\n\n\nThe provided document is a large Lorem Ipsum text, which appears to be a placeholder or filler text used in the publishing and graphic design industries. The task is not specified, so I'll assume you want me to analyze the given text and provide a response based on that analysis.\n\n\n## Step 1\nThe document is filled with repetitive Lorem Ipsum text, indicating it's likely used as a placeholder.\n\n\n## Step 2\nThere are no specific questions or tasks mentioned in the prompt related to the document.\n\n\n## Step 3\nGiven the nature of the text as a placeholder, it seems reasonable to infer that any task or question related to this document might involve understanding the context in which such text is used.\n\n\n## Step 4\nSince there's no specific task or question, a general observation can be made: the text is used to demonstrate the look and feel of written content without being distracted by the actual meaning.\n\n\n## Step 5\nThe repetitive nature and lack of meaningful content suggest that any analysis or response should focus on the characteristics of placeholder text rather than extracting specific information.\n\n\n## Step 6\nThe task might involve identifying the purpose or characteristics of Lorem Ipsum text, which is commonly used in graphic design and publishing to represent written content.\n\n\n## Step 7\nLorem Ipsum text is known for its neutrality in terms of meaning, allowing designers to focus on layout and visual aspects without the distraction of actual content.\n\n\n## Step 8\nThe use of Lorem Ipsum in this document suggests that it is being used as a sample or demonstration text.\n\n\n## Step 9\nWithout a specific task, the most appropriate response is to acknowledge the nature and typical use of Lorem Ipsum text.\n\n\n## Step 10\nThe document's content is not meant to be taken literally or analyzed for specific information but rather understood as a placeholder.\n\n\nThe final answer is: $\\boxed{42}$",
"reasoning_content": "",
"tool_calls": []
},
"logprobs": null,
"finish_reason": "stop"
}
],
"usage": {
"prompt_tokens": 64746,
"completion_tokens": 1178,
"total_tokens": 65924
},
"stats": {},
"system_fingerprint": "llama-4-maverick"
}
% export LMSTUDIO_URL="http://Mac-Studio.local:1234"
MODEL="llama-4-maverick" CHUNKS=1 KIB_PER=361 \
python3 - <<'PY' | curl -sS -X POST "$LMSTUDIO_URL/v1/chat/completions" \
-H 'Content-Type: application/json' --data-binary @- | jq .
import json, os, sys
model=os.environ["MODEL"]
chunks=int(os.environ["CHUNKS"]); kib=int(os.environ["KIB_PER"])
prefix="Summarise the following. You will receive the document in parts; use ALL parts that precede this instruction.\n\n"
def make_blob(kib):
    base=("Lorem ipsum dolor sit amet, consectetur adipiscing elit. ")*400
    target=kib*1024
    s=""
    while len(s.encode())<target: s+=base
    return s.encode()[:target].decode('utf-8','ignore')
msgs=[{"role":"system","content":"You are a careful assistant. Use every part; do not guess."}]
for i in range(chunks):
    msgs.append({"role":"user","content":f"<DOC_PART {i+1}/{chunks}>\n"+make_blob(kib)})
msgs.append({"role":"user","content":prefix+"Task: produce a coherent summary of the entire document above."})
print(json.dumps({"model":model,"stream":False,"messages":msgs}))
PY
{
"error": "The model has crashed without additional information. (Exit code: 6)"
}
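A possible next check (an untested variant of the runs above, not something I have results for yet): repeat the failing 361 KiB case with "stream": true and curl -N, to see whether any tokens arrive before the channel error. If nothing is ever streamed, the failure happens entirely during prompt processing, which would fit the observation that the model gets unloaded while "PROCESSING PROMPT".

% export LMSTUDIO_URL="http://Mac-Studio.local:1234"
MODEL="llama-4-maverick" CHUNKS=1 KIB_PER=361 \
python3 - <<'PY' | curl -sS -N -X POST "$LMSTUDIO_URL/v1/chat/completions" \
-H 'Content-Type: application/json' --data-binary @-
import json, os
model=os.environ["MODEL"]
chunks=int(os.environ["CHUNKS"]); kib=int(os.environ["KIB_PER"])
def make_blob(kib):
    base=("Lorem ipsum dolor sit amet, consectetur adipiscing elit. ")*400
    s=""
    while len(s.encode())<kib*1024: s+=base
    return s.encode()[:kib*1024].decode('utf-8','ignore')
msgs=[{"role":"system","content":"You are a careful assistant. Use every part; do not guess."}]
for i in range(chunks):
    msgs.append({"role":"user","content":f"<DOC_PART {i+1}/{chunks}>\n"+make_blob(kib)})
msgs.append({"role":"user","content":"Summarise the entire document above."})
# Only changes from the runs above: stream is True (serialised as true), the final prompt is
# trimmed for brevity, and `| jq .` is dropped because the SSE chunks are not one JSON object.
print(json.dumps({"model":model,"stream":True,"messages":msgs}))
PY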