Can the WebUI notebook be updated to be compatible with the DreamBooth extension? #1397
Replies: 2 comments
-
Looks like you're overwriting part of the environment, possibly because the webui uses deepspeed and I don't think Ben's does. You'd be better off running the fast stable diffusion notebook instead of the dreambooth one and modifying that; that way you don't disrupt any of Ben's dependencies.
-
The issue with the dreambooth extension within the WebUI is that it has quite a lot of requirements specified in its requirements.txt. That said, if you are unable to get it working correctly, I found that modifying the colab to skip some of the packages that are installed via 7zip and installing with this requirements file worked for me: https://github.com/mediocreatmybest/gaslightingeveryone/blob/main/Colab/files/test_requirements.txt I uploaded this to git as it might be helpful to someone who just wants to use the extension. Zero support, plus it may not work for you, or it may break stuff; as always, check the code before running it. That said, I did a complete run and tested today, creating checkpoints and training. All worked ok. Warning: it is slow to install, as everything is built via pip.
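For anyone trying that route, a minimal sketch of the install step in a Colab cell might look like the following. The raw URL is inferred from the repo linked above and the download path is an assumption, so adjust it to wherever you actually keep the file.

```python
# Rough Colab-cell sketch of installing the alternative requirements file.
# The raw URL below is inferred from the repo linked above; verify it before use.
!wget -q https://raw.githubusercontent.com/mediocreatmybest/gaslightingeveryone/main/Colab/files/test_requirements.txt -O /content/test_requirements.txt

# Install everything with pip (slow, since some packages are built from source).
!pip install -r /content/test_requirements.txt
```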
-
When I try to use the WebUI notebook, I have to pip install the dreambooth extension's requirements file in order to use the extension. However, when I do so, it causes the following warnings when starting the UI:
2023-01-16 16:41:19.083681: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 FMA To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2023-01-16 16:41:21.396197: W tensorflow/compiler/xla/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libnvinfer.so.7'; dlerror: libnvinfer.so.7: cannot open shared object file: No such file or directory; LD_LIBRARY_PATH: /usr/local/lib/python3.8/dist-packages/cv2/../../lib64:/usr/lib64-nvidia
2023-01-16 16:41:21.396391: W tensorflow/compiler/xla/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libnvinfer_plugin.so.7'; dlerror: libnvinfer_plugin.so.7: cannot open shared object file: No such file or directory; LD_LIBRARY_PATH: /usr/local/lib/python3.8/dist-packages/cv2/../../lib64:/usr/lib64-nvidia
2023-01-16 16:41:21.396416: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Cannot dlopen some TensorRT libraries. If you would like to use Nvidia GPU with TensorRT, please make sure the missing libraries mentioned above are installed properly.
Then, when I try to generate an image with a 2.1 model, it throws this error:
RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cpu and cuda:0! (when checking argument for argument index in method wrapper__index_select)
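For context, this kind of error just means two tensors involved in the same operation ended up on different devices, typically a weight left on the CPU while the inputs are on the GPU. A minimal, self-contained sketch of the same failure mode (not the webui's actual code, and the tensor names are made up):

```python
import torch

weight = torch.randn(10, 4)                        # stays on the CPU
indices = torch.tensor([1, 2, 3], device="cuda:0") # lives on the GPU

# Raises the same class of error as above, because the two tensors
# live on different devices:
# out = torch.index_select(weight, 0, indices)

# Typical fix: move everything onto one device before the lookup.
out = torch.index_select(weight.to("cuda:0"), 0, indices)
```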