gbx_lm.manage and gbx_lm.fastapi_server entry points raise errors #10

@anrgct

Description

The model is right there in the directory, yet the manage and fastapi_server entry points raise errors:

╭─   ~/llm_models/GreenBitAI                                            took  8s  gbai_mlx_lm
╰─❯ python -m gbx_lm.generate --model Qwen-3-30B-A3B-layer-mix-bpw-4.0-mlx  --max-tokens 100 --prompt "calculate 4*8+1024="
==========
<think>
Okay, let's see. The user wants me to calculate 4 times 8 plus 1024. Hmm, I need to remember the order of operations here. So, according to PEMDAS, multiplication comes before addition. That means I should do 4 multiplied by 8 first.

Alright, 4 times 8 is 32. Then I add that result to 1024. So 32 plus 1024. Let
==========
Prompt: 53 tokens, 55.770 tokens-per-sec
Generation: 100 tokens, 83.670 tokens-per-sec
Peak memory: 17.239 GB
╭─   ~/llm_models/GreenBitAI                                           took  15s  gbai_mlx_lm
╰─❯ python -m gbx_lm.manage --scan
Traceback (most recent call last):
  File "/Users/anrgct/miniforge3/envs/gbai_mlx_lm/lib/python3.9/runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/Users/anrgct/miniforge3/envs/gbai_mlx_lm/lib/python3.9/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/Users/anrgct/workspace/gbx-lm/gbx_lm/manage.py", line 4, in <module>
    from transformers.commands.user import tabulate
ModuleNotFoundError: No module named 'transformers.commands.user'
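The ModuleNotFoundError comes from `gbx_lm/manage.py` importing `tabulate` from `transformers.commands.user`, a private helper that newer transformers releases no longer ship. A possible workaround sketch for `manage.py`; the fallback implementation below is my own stand-in, not the original helper:

```python
# Fall back to a tiny local table formatter when the private
# transformers helper is missing (signature mimics the removed one).
try:
    from transformers.commands.user import tabulate
except ImportError:
    def tabulate(rows, headers):
        """Render rows as a plain-text table with a header separator."""
        table = [headers] + [[str(c) for c in row] for row in rows]
        widths = [max(len(r[i]) for r in table) for i in range(len(headers))]
        lines = [
            "  ".join(cell.ljust(w) for cell, w in zip(row, widths))
            for row in table
        ]
        lines.insert(1, "  ".join("-" * w for w in widths))
        return "\n".join(lines)
```

Pinning an older transformers version that still contains the helper would be an alternative, but the local fallback removes the dependency on a private module entirely.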
╭─   ~/llm_models/GreenBitAI                                                       gbai_mlx_lm
╰─❯ python -m gbx_lm.fastapi_server --model ./Qwen-3-30B-A3B-layer-mix-bpw-4.0-mlx
2025-05-13 20:05:59,343 - gbx_server - INFO - Starting GBX-Model API server. Log file: /Users/anrgct/workspace/gbx-lm/logs/server_20250513_200559.log
2025-05-13 20:05:59,343 - gbx_server - ERROR - Error loading confidence scorers: No module named 'gbx_lm.routing.libra_router.ue_router'
2025-05-13 20:05:59,343 - gbx_server - INFO - Using gbx-lm to load model from ./Qwen-3-30B-A3B-layer-mix-bpw-4.0-mlx
2025-05-13 20:05:59,344 - gbx_server - ERROR - Invalid model path: Qwen-3-30B-A3B-layer-mix-bpw-4.0-mlx
2025-05-13 20:05:59,344 - gbx_server - ERROR - Model path validation failed: Local models must be relative to the current working dir.
2025-05-13 20:05:59,344 - gbx_server - ERROR - Failed to load model from ./Qwen-3-30B-A3B-layer-mix-bpw-4.0-mlx: Local models must be relative to the current working dir.
2025-05-13 20:05:59,344 - gbx_server - ERROR - Failed to initialize ModelProvider: Local models must be relative to the current working dir.
Traceback (most recent call last):
  File "/Users/anrgct/workspace/gbx-lm/gbx_lm/fastapi_server.py", line 812, in main
    app, server_config, logger = create_app(args)
  File "/Users/anrgct/workspace/gbx-lm/gbx_lm/fastapi_server.py", line 300, in create_app
    model_provider = ModelProvider(server_config.model_config)
  File "/Users/anrgct/workspace/gbx-lm/gbx_lm/fastapi_server.py", line 179, in __init__
    self.load(self.cli_args.model)
  File "/Users/anrgct/workspace/gbx-lm/gbx_lm/fastapi_server.py", line 221, in load
    self._validate_model_path(model_path)
  File "/Users/anrgct/workspace/gbx-lm/gbx_lm/fastapi_server.py", line 189, in _validate_model_path
    raise RuntimeError("Local models must be relative to the current working dir.")
RuntimeError: Local models must be relative to the current working dir.
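The RuntimeError above suggests `_validate_model_path` trips over the leading `./` even though the directory is under the working dir. A hedged sketch of a normalisation that accepts both `model-dir` and `./model-dir`; the function name and logic are assumptions, not the actual gbx-lm code:

```python
from pathlib import Path

def validate_model_path(model_path: str) -> Path:
    """Hypothetical replacement for _validate_model_path: resolve the
    path (collapsing a leading './') before checking it lives under
    the current working directory."""
    resolved = Path(model_path).resolve()
    cwd = Path.cwd().resolve()
    if not resolved.is_relative_to(cwd):  # Python 3.9+
        raise RuntimeError(
            "Local models must be relative to the current working dir."
        )
    return resolved
```

With this check, `./Qwen-3-30B-A3B-layer-mix-bpw-4.0-mlx` and `Qwen-3-30B-A3B-layer-mix-bpw-4.0-mlx` resolve to the same directory and both pass.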

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/anrgct/miniforge3/envs/gbai_mlx_lm/lib/python3.9/runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/Users/anrgct/miniforge3/envs/gbai_mlx_lm/lib/python3.9/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/Users/anrgct/workspace/gbx-lm/gbx_lm/fastapi_server.py", line 820, in <module>
    main()
  File "/Users/anrgct/workspace/gbx-lm/gbx_lm/fastapi_server.py", line 816, in main
    logger.error(f"Server startup failed: {str(e)}")
UnboundLocalError: local variable 'logger' referenced before assignment
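The final UnboundLocalError masks the real failure: in `main`, `logger` is only bound by the return of `create_app(args)`, which raised first, so the `except` clause references an unbound name. A sketch of the likely fix; the stubbed `create_app` merely reproduces the failure from the traceback, and the structure is illustrative:

```python
import logging

def create_app(args):
    # Stub that fails the same way the real create_app did above.
    raise RuntimeError("Local models must be relative to the current working dir.")

def main(args=None):
    # Bind a fallback logger *before* calling create_app so the
    # except clause can always log, even on an early failure.
    logger = logging.getLogger("gbx_server")
    try:
        app, server_config, logger = create_app(args)
    except Exception as e:
        logger.error(f"Server startup failed: {e}")
        return 1
    return 0
```

This way a startup failure is reported once through the fallback logger instead of being replaced by a second, unrelated exception.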
