1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -11,6 +11,7 @@ but cannot always guarantee backwards compatibility. Changes that may **break co

**Improved**

- Updated ["Use a TPU" User Guide](https://unit8co.github.io/darts/userguide/gpu_and_tpu_usage.html#use-a-tpu) to recommend newer `pytorch-lightning>=2.5.3` compatible with `torch_xla>=2.7.0` on Google Colab. []() by [Zhihao Dai](https://github.com/daidahao).
- 🔴 Added future and static covariates support to `BlockRNNModel`. This improvement required changes to the underlying model architecture which means that saved model instances from older Darts versions cannot be loaded any longer. [#2845](https://github.com/unit8co/darts/pull/2845) by [Gabriel Margaria](https://github.com/Jaco-Pastorius).

**Fixed**
7 changes: 4 additions & 3 deletions docs/userguide/gpu_and_tpu_usage.md
@@ -170,9 +170,10 @@ There are three main ways to get access to a TPU:

If you are using a TPU in a Google Colab notebook, you should first install these:
```
-!pip install cloud-tpu-client==0.10 https://storage.googleapis.com/tpu-pytorch/wheels/torch_xla-1.9-cp37-cp37m-linux_x86_64.whl
-!pip install torch==1.9.0+cu111 torchvision==0.10.0+cu111 torchtext==0.10.0 -f https://download.pytorch.org/whl/cu111/torch_stable.html
-!pip install pyyaml==5.4.1
+! pip install "pytorch-lightning>=2.5.3"
+! pip install --pre torch torchvision --index-url https://download.pytorch.org/whl/nightly/cpu
+! pip install 'torch_xla[tpu] @ https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.9.0.dev-cp312-cp312-linux_x86_64.whl' \
+    -f https://storage.googleapis.com/libtpu-wheels/index.html
```

and then instruct our model to use one or more TPUs. In our example we use four TPU cores, like this:
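In Darts, the PyTorch Lightning Trainer options are forwarded through a model's `pl_trainer_kwargs` argument. A minimal sketch of the four-TPU setup (the model class and hyperparameters below are illustrative, not taken from this PR):

```python
# Trainer options for four TPU cores; this dict is forwarded
# to the underlying pytorch_lightning.Trainer by Darts.
pl_trainer_kwargs = {
    "accelerator": "tpu",  # use the XLA/TPU backend
    "devices": 4,          # train on four TPU cores
}

# Illustrative usage with a Darts torch-based model, e.g.:
# from darts.models import RNNModel
# model = RNNModel(
#     model="LSTM",
#     input_chunk_length=24,
#     pl_trainer_kwargs=pl_trainer_kwargs,
# )
```

Any torch-based Darts forecasting model accepts `pl_trainer_kwargs` the same way, so the same dict can be reused across models.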