Commit 3f2b064

Update usage.rst
- more detailed note on the effect of temp.
1 parent 1e6db6b commit 3f2b064

File tree: 1 file changed (+7, -6 lines changed)

docs/source/usage.rst

Lines changed: 7 additions & 6 deletions

@@ -265,9 +265,8 @@ For standard usage we recommend the default values (i.e., ``InfoNCE`` and ``cosi

 .. rubric:: Temperature :py:attr:`~.CEBRA.temperature`

-:py:attr:`~.CEBRA.temperature` has the largest effect on visualization of the embedding (see :py:doc:`cebra-figures/figures/ExtendedDataFigure2`). Hence, it is important that it is fitted to your specific data.
+:py:attr:`~.CEBRA.temperature` has the largest effect on *visualization* of the embedding (see :py:doc:`cebra-figures/figures/ExtendedDataFigure2`). Hence, it is important that it is fitted to your specific data. A smoother embedding will be achieved with a temperature of 0.1, while 1.0 will yield a more "clustered" embedding.

-The simplest way to handle it is to use a *learnable temperature*. For that, set :py:attr:`~.CEBRA.temperature_mode` to ``auto``. :py:attr:`~.CEBRA.temperature` will be trained alongside the model.

 🚀 For advanced usage, you might need to find the optimal :py:attr:`~.CEBRA.temperature`. For that, we recommend performing a grid search.

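For illustration, the grid search suggested above can be sketched with CEBRA's scikit-learn-style ``fit``/``transform`` interface. This is a minimal sketch only: ``neural_data`` is a hypothetical array standing in for your recording, and the candidate temperatures are arbitrary.

    # Minimal sketch: fit one model per candidate (constant) temperature and
    # keep the resulting embeddings for visual comparison.
    import numpy as np
    from cebra import CEBRA

    neural_data = np.random.normal(size=(1000, 30))  # placeholder (time x neurons)

    embeddings = {}
    for temperature in (0.1, 0.3, 1.0):
        model = CEBRA(
            model_architecture="offset10-model",
            batch_size=512,
            temperature_mode="constant",
            temperature=temperature,
            max_iterations=10,
            time_offsets=10,
        )
        model.fit(neural_data)  # discovery-mode (time contrastive) training
        embeddings[temperature] = model.transform(neural_data)

    # Lower temperatures should give a smoother embedding, higher ones a more
    # "clustered" one; compare the entries of `embeddings` visually.
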
@@ -307,7 +306,8 @@ Here is an example of a CEBRA model initialization:
 cebra_model = CEBRA(
     model_architecture = "offset10-model",
     batch_size = 1024,
-    temperature_mode="auto",
+    temperature_mode='constant',
+    temperature=0.1,
     learning_rate = 0.001,
     max_iterations = 10,
     time_offsets = 10,
@@ -321,7 +321,7 @@ Here is an example of a CEBRA model initialization:
 .. testoutput::

     CEBRA(batch_size=1024, learning_rate=0.001, max_iterations=10,
-          model_architecture='offset10-model', temperature_mode='auto',
+          model_architecture='offset10-model',
           time_offsets=10)

 .. admonition:: See API docs
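As a quick follow-up to an initialization like the one above (a sketch, assuming CEBRA follows the scikit-learn estimator convention of exposing constructor arguments via ``get_params()``), the configured temperature can be read back before training:

    # Sketch: inspect the configured temperature on the (untrained) estimator.
    print(cebra_model.get_params()["temperature_mode"])  # 'constant'
    print(cebra_model.get_params()["temperature"])       # 0.1
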
@@ -568,7 +568,8 @@ We provide a simple hyperparameters sweep to compare CEBRA models with different
     learning_rate = [0.001],
     time_offsets = 5,
     max_iterations = 5,
-    temperature_mode = "auto",
+    temperature_mode='constant',
+    temperature = 0.1,
     verbose = False)

 # 2. Define the datasets to iterate over
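For context, the sweep that this parameter grid feeds into continues roughly as sketched below. This is an illustration rather than part of the diff: the data and label variables are hypothetical placeholders, and it assumes the ``cebra.grid_search.GridSearch`` helper with a ``fit_models()`` method taking the datasets, the parameter grid, and an output directory.

    # Sketch (assumed API): consume a parameter grid like the one defined above.
    import numpy as np
    import cebra

    # Hypothetical placeholder data and labels.
    neural_session1 = np.random.normal(size=(1000, 30))
    continuous_label1 = np.random.normal(size=(1000, 3))

    params_grid = dict(
        temperature_mode = 'constant',
        temperature = [0.1, 1.0],     # candidate values to compare
        max_iterations = 5,
        verbose = False)

    datasets = {
        "dataset1": neural_session1,                       # time contrastive learning
        "dataset2": (neural_session1, continuous_label1),  # behavioral contrastive learning
    }

    # Fit one model per parameter combination and per dataset, saving to disk.
    grid_search = cebra.grid_search.GridSearch()
    grid_search.fit_models(datasets=datasets,
                           params=params_grid,
                           models_dir="saved_models")
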
@@ -820,7 +821,7 @@ It takes a CEBRA model and returns a 2D plot of the loss against the number of i
 Displaying the temperature
 """"""""""""""""""""""""""

-:py:attr:`~.CEBRA.temperature` has the largest effect on the visualization of the embedding. Hence it might be interesting to check its evolution when ``temperature_mode=auto``.
+:py:attr:`~.CEBRA.temperature` has the largest effect on the visualization of the embedding. Hence, it might be interesting to check its evolution when ``temperature_mode=auto``. We recommend only using ``auto`` if you have first explored the ``constant`` setting.

 To that end, you can use the function :py:func:`~.plot_temperature`.

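A minimal sketch of this in practice, assuming :py:func:`~.plot_temperature` is exposed as ``cebra.plot_temperature`` and using a hypothetical ``neural_data`` array:

    # Sketch: train with a learnable temperature and display its evolution.
    import numpy as np
    import cebra
    from cebra import CEBRA

    neural_data = np.random.normal(size=(1000, 30))  # placeholder (time x neurons)

    cebra_model = CEBRA(
        model_architecture="offset10-model",
        batch_size=512,
        temperature_mode="auto",  # temperature is optimized alongside the model
        max_iterations=10,
        time_offsets=10,
    )
    cebra_model.fit(neural_data)

    # Plot the learned temperature value against the training iterations.
    ax = cebra.plot_temperature(cebra_model)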