diff --git a/CHANGELOG.md b/CHANGELOG.md
index 9f922f4a70..217012d240 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -11,6 +11,7 @@ but cannot always guarantee backwards compatibility. Changes that may **break co
 
 **Improved**
 
+- Updated the ["Use a TPU" User Guide](https://unit8co.github.io/darts/userguide/gpu_and_tpu_usage.html#use-a-tpu) to recommend the newer `pytorch-lightning>=2.5.3`, which is compatible with `torch_xla>=2.7.0` on Google Colab. []() by [Zhihao Dai](https://github.com/daidahao).
 - 🔴 Added future and static covariates support to `BlockRNNModel`. This improvement required changes to the underlying model architecture which means that saved model instances from older Darts versions cannot be loaded any longer. [#2845](https://github.com/unit8co/darts/pull/2845) by [Gabriel Margaria](https://github.com/Jaco-Pastorius).
 
 **Fixed**
diff --git a/docs/userguide/gpu_and_tpu_usage.md b/docs/userguide/gpu_and_tpu_usage.md
index 02f646d49f..3ad5093319 100644
--- a/docs/userguide/gpu_and_tpu_usage.md
+++ b/docs/userguide/gpu_and_tpu_usage.md
@@ -170,9 +170,10 @@ There are three main ways to get access to a TPU:
 If you are using a TPU in the Google Colab kind of notebook, then you should first install these:
 
 ```
-!pip install cloud-tpu-client==0.10 https://storage.googleapis.com/tpu-pytorch/wheels/torch_xla-1.9-cp37-cp37m-linux_x86_64.whl
-!pip install torch==1.9.0+cu111 torchvision==0.10.0+cu111 torchtext==0.10.0 -f https://download.pytorch.org/whl/cu111/torch_stable.html
-!pip install pyyaml==5.4.1
+! pip install "pytorch-lightning>=2.5.3"
+! pip install --pre torch torchvision --index-url https://download.pytorch.org/whl/nightly/cpu
+! pip install 'torch_xla[tpu] @ https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.9.0.dev-cp312-cp312-linux_x86_64.whl' \
+    -f https://storage.googleapis.com/libtpu-wheels/index.html
 ```
 
 and then instruct our model to use a TPU or more. In our example we are using four TPUs, like this:
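
For reference, a minimal sketch (not part of the patch) of the step the last context line leads into: handing training off to four TPU cores through Darts' `pl_trainer_kwargs` pass-through to PyTorch Lightning, using the 2.x `accelerator`/`devices` trainer arguments. The model choice, dataset, and epoch count below are illustrative assumptions; the guide's actual snippet lies outside this hunk.

```python
# Sketch only: train a Darts model on four TPU cores via PyTorch Lightning.
from darts.datasets import AirPassengersDataset  # illustrative dataset, an assumption
from darts.models import NBEATSModel             # illustrative model choice, an assumption

series = AirPassengersDataset().load()

model = NBEATSModel(
    input_chunk_length=24,
    output_chunk_length=12,
    pl_trainer_kwargs={
        "accelerator": "tpu",  # route training to the TPU backend
        "devices": 4,          # the four TPU cores mentioned in the guide
    },
)
model.fit(series, epochs=5)
```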