
Commit 758d0ff

update docs
Parent: 3f5b3ae

54 files changed: +1825 additions, -336 deletions


adapt/feature_based/_deep.py

Lines changed: 12 additions & 3 deletions
@@ -562,7 +562,10 @@ class DANN(BaseDeepFeature):
     classification DA but it could be widen to other task in
     **supervised** DA straightforwardly.
 
-    .. image:: ../_static/images/dann.png
+    .. figure:: ../_static/images/dann.png
+        :align: center
+
+        DANN architecture (source: [1])
 
     Parameters
     ----------
@@ -811,7 +814,10 @@ class ADDA(BaseDeepFeature):
     classification DA but it could be widen to other task in **supervised**
     DA straightforwardly.
 
-    .. image:: ../_static/images/adda.png
+    .. figure:: ../_static/images/adda.png
+        :align: center
+
+        Overview of the ADDA approach (source: [1])
 
     Parameters
     ----------
@@ -1305,7 +1311,10 @@ class DeepCORAL(BaseDeepFeature):
     Notice that DeepCORAL only uses labeled source and unlabeled target
     data. It belongs then to "unsupervised" domain adaptation methods.
 
-    .. image:: ../_static/images/deepcoral.png
+    .. figure:: ../_static/images/deepcoral.png
+        :align: center
+
+        DeepCORAL architecture (source: [1])
 
     Parameters
     ----------
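
A side note on the directive swap above: in reStructuredText, ``.. image::`` cannot carry a caption, whereas ``.. figure::`` wraps the image and accepts options such as ``:align:`` plus an indented caption paragraph, which is what lets each docstring credit its figure to reference [1]. A minimal illustrative docstring is sketched below; the class name and image path are placeholders, not part of this commit.

class SomeDeepDAMethod:
    """Toy docstring reproducing the figure pattern used in this commit.

    .. figure:: ../_static/images/some_method.png
        :align: center

        Caption rendered below the image (source: [1])
    """
    pass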

adapt/feature_based/_fe.py

Lines changed: 2 additions & 2 deletions
@@ -90,8 +90,8 @@ class FE:
     .. [1] `[1] <https://arxiv.org/pdf/0907.1815\
 .pdf>`_ Daume III, H. "Frustratingly easy domain adaptation". In ACL, 2007.
 
-    Note
-    ----
+    Notes
+    -----
     FE can be used for multi-source DA by giving list of source data
     for arguments Xs, ys of fit method : Xs = [Xs1, Xs2, ...],
     ys = [ys1, ys2, ...]
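
The renamed ``Notes`` section above describes a multi-source calling convention for ``fit``. A rough sketch of such a call follows; only the list-per-source convention for ``Xs`` and ``ys`` comes from the diff, while the estimator argument and the labeled-target ``Xt``/``yt`` arguments are assumptions about the FE API rather than something this commit shows.

import numpy as np
from sklearn.linear_model import LogisticRegression
from adapt.feature_based import FE

rng = np.random.RandomState(0)
Xs1, ys1 = rng.randn(100, 5), rng.randint(2, size=100)        # source domain 1
Xs2, ys2 = rng.randn(100, 5) + 1., rng.randint(2, size=100)   # source domain 2
Xt, yt = rng.randn(20, 5) + 0.5, rng.randint(2, size=20)      # a few labeled target samples

fe = FE(LogisticRegression())    # estimator argument: assumed, not shown in this diff
# Multi-source fit as described in the Notes section: one array per source domain.
fe.fit(Xs=[Xs1, Xs2], ys=[ys1, ys2], Xt=Xt, yt=yt)            # Xt/yt: assumed signature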

adapt/parameter_based/_regular.py

Lines changed: 32 additions & 0 deletions
@@ -431,6 +431,12 @@ class RegularTransferNN:
     neural network through the ``lambdas`` parameter.
     Some layers can also be frozen during training via
     the ``training`` parameter.
+
+    .. figure:: ../_static/images/regulartransfer.png
+        :align: center
+
+        Transferring parameters of a CNN pretrained on Imagenet
+        (source: [2])
 
     Parameters
     ----------
@@ -482,6 +488,32 @@ class RegularTransferNN:
         history of the losses and metrics across the epochs
         of the network training.
 
+    Examples
+    --------
+    >>> import numpy as np
+    >>> import tensorflow as tf
+    >>> from adapt.parameter_based import RegularTransferNN
+    >>> np.random.seed(0)
+    >>> tf.random.set_seed(0)
+    >>> Xs = np.random.randn(50) * 0.1
+    >>> Xs = np.concatenate((Xs, Xs + 1.))
+    >>> Xt = np.random.randn(100) * 0.1
+    >>> ys = (np.array([-0.2 * x if x<0.5 else 1. for x in Xs])
+    ...       + 0.1 * np.random.randn(100))
+    >>> yt = 0.75 * Xt + 0.1 * np.random.randn(100)
+    >>> model = tf.keras.Sequential()
+    >>> model.add(tf.keras.layers.Dense(1))
+    >>> model.compile(optimizer="adam", loss="mse")
+    >>> model.predict(Xt.reshape(-1,1))
+    >>> model.fit(Xs.reshape(-1, 1), ys, epochs=300, verbose=0)
+    >>> np.abs(model.predict(Xt).ravel() - yt).mean()
+    0.48265...
+    >>> rt = RegularTransferNN(model, lambdas=0.01, random_state=0)
+    >>> rt.fit(Xt[:10], yt[:10], epochs=300, verbose=0)
+    >>> rt.predict(Xt.reshape(-1, 1))
+    >>> np.abs(rt.predict(Xt).ravel() - yt).mean()
+    0.23114...
+
     See also
     --------
     RegularTransferLR, RegularTransferLC
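
For context on the ``lambdas`` parameter that the new figure and example refer to: regularized parameter transfer of this kind fine-tunes the pre-trained network on target data while penalizing deviation from the source weights, roughly (my notation, not taken from the commit)

    \min_{\theta}\; \mathcal{L}\big(f_{\theta}(X_T),\, y_T\big) \;+\; \sum_{i} \lambda_i \,\lVert \theta_i - \theta_i^{\mathrm{src}} \rVert_2^2

where \theta_i ranges over the network's trainable weight tensors and \theta^{\mathrm{src}} are the weights learned on the source domain. In the doctest added above, this is what lets a model fitted on the shifted source data adapt to only ten labeled target points without drifting far from the source solution; the reported target MAE drops from about 0.48 to 0.23.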

docs/.buildinfo

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
 # Sphinx build info version 1
 # This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
-config: 5cef7bb97a2b4045581541c4416376d8
+config: d55b6138e9bd8e9e2ea847d162fd62a2
 tags: 645f666f9bcd5a90fca523b33c5a78b7

docs/_images/adda.png (83.1 KB)

docs/_images/dann.png (60.2 KB)

docs/_images/deepcoral.png (166 KB)

docs/_images/regulartransfer.png (183 KB)

docs/_sources/contents.rst.txt

Lines changed: 3 additions & 0 deletions
@@ -55,6 +55,7 @@ and **target** distributions. The **task** is then learned in this **encoded fea
     :toctree: generated/
     :template: class.rst
 
+    feature_based.BaseDeepFeature
     feature_based.FE
     feature_based.CORAL
     feature_based.DeepCORAL
@@ -156,3 +157,5 @@ This module contains utility functions used in the previous modules.
     utils.GradientHandler
     utils.make_classification_da
     utils.make_regression_da
+
+:ref:`Two_moons.ipynb`

docs/_sources/generated/adapt.feature_based.ADDA.rst.txt

Lines changed: 6 additions & 0 deletions
@@ -16,18 +16,24 @@
    .. autosummary::
 
       ~ADDA.__init__
+      ~ADDA.create_model
       ~ADDA.fit
       ~ADDA.fit_source
       ~ADDA.fit_target
+      ~ADDA.get_loss
+      ~ADDA.get_metrics
       ~ADDA.predict
       ~ADDA.predict_disc
      ~ADDA.predict_features
 
 
    .. automethod:: __init__
+   .. automethod:: create_model
    .. automethod:: fit
    .. automethod:: fit_source
    .. automethod:: fit_target
+   .. automethod:: get_loss
+   .. automethod:: get_metrics
    .. automethod:: predict
    .. automethod:: predict_disc
    .. automethod:: predict_features

0 commit comments