Error when ticking 'Use LLM LoRA to avoid censored' with Joycaption Pre-Alpha #10

@keca090

Description

Ticking "Use LLM LoRA to avoid censored" with Joycaption Pre-Alpha raises the following traceback:

Traceback (most recent call last):
  File "C:\joycaption\venv\lib\site-packages\gradio\queueing.py", line 625, in process_events
    response = await route_utils.call_process_api(
  File "C:\joycaption\venv\lib\site-packages\gradio\route_utils.py", line 322, in call_process_api
    output = await app.get_blocks().process_api(
  File "C:\joycaption\venv\lib\site-packages\gradio\blocks.py", line 2103, in process_api
    result = await self.call_function(
  File "C:\joycaption\venv\lib\site-packages\gradio\blocks.py", line 1650, in call_function
    prediction = await anyio.to_thread.run_sync(  # type: ignore
  File "C:\joycaption\venv\lib\site-packages\anyio\to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
  File "C:\joycaption\venv\lib\site-packages\anyio\_backends\_asyncio.py", line 2461, in run_sync_in_worker_thread
    return await future
  File "C:\joycaption\venv\lib\site-packages\anyio\_backends\_asyncio.py", line 962, in run
    result = context.run(func, *args)
  File "C:\joycaption\venv\lib\site-packages\gradio\utils.py", line 890, in wrapper
    response = f(*args, **kwargs)
  File "C:\joycaption\wd_llm_caption\gui.py", line 619, in caption_models_load
    caption_init.download_models(args)
  File "C:\joycaption\wd_llm_caption\caption.py", line 132, in download_models
    self.llm_models_paths = download_models(
  File "C:\joycaption\wd_llm_caption\utils\download.py", line 245, in download_models
    llm_patch_path = Path(os.path.dirname(models_path[3]))
IndexError: list index out of range
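
The final frame shows download_models() in wd_llm_caption/utils/download.py indexing models_path[3], so with the LoRA option ticked the list it builds apparently has fewer than four entries for the Joycaption Pre-Alpha configuration. Below is a minimal sketch of a guard around that line; models_path, llm_patch_path, and the [3] index come from the traceback, while the helper name, the length check, and the error message are assumptions about what the list is expected to hold when the LLM LoRA patch is enabled:

```python
import os
from pathlib import Path


def resolve_llm_patch_path(models_path: list[str]) -> Path:
    """Hypothetical helper mirroring the failing line in
    wd_llm_caption/utils/download.py (line 245 in the traceback).

    models_path and the [3] index are taken from the traceback; the
    length check and error message are assumptions about what the list
    should contain when the LLM LoRA patch option is ticked.
    """
    if len(models_path) < 4:
        raise RuntimeError(
            "LLM LoRA patch requested, but the model download returned "
            f"only {len(models_path)} path(s); the LoRA patch file appears "
            "to be missing from the download list for Joycaption Pre-Alpha."
        )
    # Original behaviour: the fourth entry is treated as the LoRA patch
    # file, and its parent directory is used as the patch path.
    return Path(os.path.dirname(models_path[3]))
```

This would not fix the missing entry itself, but it would turn the bare IndexError into a message pointing at the LoRA patch download step.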
