
Commit 1b4f1cd

Merge pull request #1447 from rstudio/retether-3.3.3
Retether to Keras 3.3.3
2 parents b40657b + e5be5be

663 files changed, +3188 -631 lines


.tether/man/Constraint.txt

Lines changed: 1 addition & 1 deletion
@@ -52,7 +52,7 @@ class Constraint(builtins.object)
 | ----------------------------------------------------------------------
 | Class methods defined here:
 |
-| from_config(config) from builtins.type
+| from_config(config)
 | Instantiates a weight constraint from a configuration dictionary.
 |
 | Example:
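
Not part of the diff: as a quick illustration of the `from_config()` entry above, a minimal round-trip sketch, assuming the built-in `MaxNorm` constraint:

```python
import keras

# Serialize a built-in weight constraint to a plain config dict.
constraint = keras.constraints.MaxNorm(max_value=2.0)
config = constraint.get_config()

# Rebuild it with the class method documented in the tether file above.
restored = keras.constraints.MaxNorm.from_config(config)
```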

.tether/man/InputLayer.txt

Lines changed: 1 addition & 0 deletions
@@ -11,6 +11,7 @@ class InputLayer(keras.src.layers.layer.Layer)
 | tensorflow.python.trackable.autotrackable.AutoTrackable
 | tensorflow.python.trackable.base.Trackable
 | keras.src.ops.operation.Operation
+| keras.src.saving.keras_saveable.KerasSaveable
 | builtins.object
 |
 | Methods defined here:

.tether/man/Layer.txt

Lines changed: 2 additions & 1 deletion
@@ -1,6 +1,6 @@
 Help on class Layer in module keras.src.layers.layer:
 
-class Layer(keras.src.backend.tensorflow.layer.TFLayer, keras.src.ops.operation.Operation)
+class Layer(keras.src.backend.tensorflow.layer.TFLayer, keras.src.ops.operation.Operation, keras.src.saving.keras_saveable.KerasSaveable)
 | Layer(*args, **kwargs)
 |
 | This is the class from which all layers inherit.
@@ -161,6 +161,7 @@ class Layer(keras.src.backend.tensorflow.layer.TFLayer, keras.src.ops.operation.
 | tensorflow.python.trackable.autotrackable.AutoTrackable
 | tensorflow.python.trackable.base.Trackable
 | keras.src.ops.operation.Operation
+| keras.src.saving.keras_saveable.KerasSaveable
 | builtins.object
 |
 | Methods defined here:
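
For context only (not in the diff): the new `KerasSaveable` base reflects the serialization machinery, not a change to how custom layers are written; a serializable layer still just implements `call()` and `get_config()`. A hedged sketch, with a made-up `Scale` layer and an arbitrary file path:

```python
import keras

@keras.saving.register_keras_serializable(package="example")
class Scale(keras.layers.Layer):
    """Multiplies its input by a fixed factor."""

    def __init__(self, factor=2.0, **kwargs):
        super().__init__(**kwargs)
        self.factor = factor

    def call(self, inputs):
        return inputs * self.factor

    def get_config(self):
        # Record constructor arguments so the layer can be rebuilt on load.
        config = super().get_config()
        config.update({"factor": self.factor})
        return config

model = keras.Sequential([keras.Input(shape=(4,)), Scale(factor=3.0)])
model.save("scale_model.keras")  # arbitrary path
restored = keras.models.load_model("scale_model.keras")
```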

.tether/man/LearningRateSchedule.txt

Lines changed: 1 addition & 1 deletion
@@ -52,7 +52,7 @@ class LearningRateSchedule(builtins.object)
 | ----------------------------------------------------------------------
 | Class methods defined here:
 |
-| from_config(config) from builtins.type
+| from_config(config)
 | Instantiates a `LearningRateSchedule` from its config.
 |
 | Args:
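
Not part of the diff: the same `from_config()` round trip for a schedule, sketched with the built-in `ExponentialDecay` and arbitrary hyperparameters:

```python
import keras

schedule = keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=1e-3,
    decay_steps=1000,
    decay_rate=0.9,
)
config = schedule.get_config()

# Rebuild the schedule from its config dict.
restored = keras.optimizers.schedules.ExponentialDecay.from_config(config)
```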

.tether/man/Loss.txt

Lines changed: 8 additions & 15 deletions
@@ -1,11 +1,7 @@
 Help on class Loss in module keras.src.losses.loss:
 
-class Loss(builtins.object)
-| Loss(
-| name=None,
-| reduction='sum_over_batch_size',
-| dtype=None
-| )
+class Loss(keras.src.saving.keras_saveable.KerasSaveable)
+| Loss(name=None, reduction='sum_over_batch_size', dtype=None)
 |
 | Loss base class.
 |
@@ -22,6 +18,11 @@ class Loss(builtins.object)
 | return ops.mean(ops.square(y_pred - y_true), axis=-1)
 | ```
 |
+| Method resolution order:
+| Loss
+| keras.src.saving.keras_saveable.KerasSaveable
+| builtins.object
+|
 | Methods defined here:
 |
 | __call__(
@@ -51,14 +52,6 @@ class Loss(builtins.object)
 | ----------------------------------------------------------------------
 | Class methods defined here:
 |
-| from_config(config) from builtins.type
-|
-| ----------------------------------------------------------------------
-| Data descriptors defined here:
-|
-| __dict__
-| dictionary for instance variables
+| from_config(config)
 |
-| __weakref__
-| list of weak references to the object
 
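
For context, the `ops.mean(ops.square(y_pred - y_true), axis=-1)` excerpt above belongs to a `Loss` subclass roughly like the following sketch (the `CustomMSE` name is illustrative):

```python
import keras
from keras import ops

class CustomMSE(keras.losses.Loss):
    """Mean squared error written against the Loss base class."""

    def call(self, y_true, y_pred):
        # Return the per-sample loss; the base class applies the reduction.
        return ops.mean(ops.square(y_pred - y_true), axis=-1)

loss_fn = CustomMSE(name="custom_mse")
# The config round-trips through the from_config() shown in the diff above.
restored = CustomMSE.from_config(loss_fn.get_config())
```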

.tether/man/Metric.txt

Lines changed: 7 additions & 10 deletions
@@ -1,6 +1,6 @@
 Help on class Metric in module keras.src.metrics.metric:
 
-class Metric(builtins.object)
+class Metric(keras.src.saving.keras_saveable.KerasSaveable)
 | Metric(dtype=None, name=None)
 |
 | Encapsulates metric logic and state.
@@ -77,6 +77,11 @@ class Metric(builtins.object)
 | return self.true_positives
 | ```
 |
+| Method resolution order:
+| Metric
+| keras.src.saving.keras_saveable.KerasSaveable
+| builtins.object
+|
 | Methods defined here:
 |
 | __call__(
@@ -159,7 +164,7 @@ class Metric(builtins.object)
 | ----------------------------------------------------------------------
 | Class methods defined here:
 |
-| from_config(config) from builtins.type
+| from_config(config)
 |
 | ----------------------------------------------------------------------
 | Readonly properties defined here:
@@ -168,12 +173,4 @@ class Metric(builtins.object)
 |
 | variables
 |
-| ----------------------------------------------------------------------
-| Data descriptors defined here:
-|
-| __dict__
-| dictionary for instance variables
-|
-| __weakref__
-| list of weak references to the object
 
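
For context, the `return self.true_positives` excerpt above comes from the stateful-metric pattern; a hedged sketch for binary 0/1 labels (class and variable names are illustrative, and `sample_weight` is ignored here):

```python
import keras
from keras import ops

class TruePositives(keras.metrics.Metric):
    """Counts true positives for binary 0/1 labels and predictions."""

    def __init__(self, name="true_positives", **kwargs):
        super().__init__(name=name, **kwargs)
        self.true_positives = self.add_variable(
            shape=(), initializer=keras.initializers.Zeros(), name="tp"
        )

    def update_state(self, y_true, y_pred, sample_weight=None):
        # sample_weight is ignored in this sketch.
        matches = ops.logical_and(ops.equal(y_true, 1), ops.equal(y_pred, 1))
        self.true_positives.assign_add(ops.sum(ops.cast(matches, self.dtype)))

    def result(self):
        return self.true_positives
```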

.tether/man/callback_early_stopping.txt

Lines changed: 1 addition & 1 deletion
@@ -44,7 +44,6 @@ class EarlyStopping(keras.src.callbacks.callback.Callback)
 | improvement is expected and thus training will not be stopped.
 | Defaults to `0`.
 |
-|
 | Example:
 |
 | >>> callback = keras.callbacks.EarlyStopping(monitor='loss',
@@ -118,3 +117,4 @@ class EarlyStopping(keras.src.callbacks.callback.Callback)
 | `on_epoch_end()` is passed to this argument for this method but
 | that may change in the future.
 |
+
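
The `>>> callback = keras.callbacks.EarlyStopping(monitor='loss', ...)` example above is truncated in this hunk; in use it looks roughly like this sketch (the tiny model and random data are placeholders):

```python
import keras
import numpy as np

model = keras.Sequential([keras.Input(shape=(8,)), keras.layers.Dense(1)])
model.compile(optimizer="adam", loss="mse")

# Stop once the training loss fails to improve for 3 consecutive epochs.
callback = keras.callbacks.EarlyStopping(monitor="loss", patience=3)
x, y = np.random.rand(64, 8), np.random.rand(64, 1)
history = model.fit(x, y, epochs=20, callbacks=[callback], verbose=0)
```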

.tether/man/clone_model.txt

Lines changed: 123 additions & 0 deletions
@@ -0,0 +1,123 @@
__signature__
keras.models.clone_model(
    model,
    input_tensors=None,
    clone_function=None,
    call_function=None,
    recursive=False,
    **kwargs
)
__doc__
Clone a Functional or Sequential `Model` instance.

Model cloning is similar to calling a model on new inputs,
except that it creates new layers (and thus new weights) instead
of sharing the weights of the existing layers.

Note that
`clone_model` will not preserve the uniqueness of shared objects within the
model (e.g. a single variable attached to two distinct layers will be
restored as two separate variables).

Args:
    model: Instance of `Model`
        (could be a Functional model or a Sequential model).
    input_tensors: optional list of input tensors or InputLayer objects
        to build the model upon. If not provided,
        new `Input` objects will be created.
    clone_function: Callable with signature `fn(layer)`
        to be used to clone each layer in the target
        model (except `Input` instances). It takes as argument the
        layer instance to be cloned, and returns the corresponding layer
        instance to be used in the model copy. If unspecified, this callable
        defaults to the following serialization/deserialization function:
        `lambda layer: layer.__class__.from_config(layer.get_config())`.
        By passing a custom callable, you can customize your copy of the
        model, e.g. by wrapping certain layers of interest (you might want
        to replace all `LSTM` instances with equivalent
        `Bidirectional(LSTM(...))` instances, for example).
        Defaults to `None`.
    call_function: Callable with signature
        `fn(layer, *args, **kwargs)` to be used to call each
        cloned layer and a set of inputs. It takes the layer instance,
        the call arguments and keyword arguments, and returns the
        call outputs. If unspecified, this callable defaults to
        the regular `__call__()` method:
        `def fn(layer, *args, **kwargs): return layer(*args, **kwargs)`.
        By passing a custom callable, you can insert new layers before or
        after a given layer. Note: this argument can only be used with
        Functional models.
    recursive: Boolean. Whether to recursively clone any Sequential
        or Functional models encountered in the original
        Sequential/Functional model. If `False`,
        then inner models are cloned by calling `clone_function()`.
        If `True`, then inner models are cloned by calling `clone_model()`
        with the same `clone_function`, `call_function`, and `recursive`
        arguments. Note that in this case, `call_function`
        will not be propagated to any Sequential model
        (since it is not applicable to Sequential models).

Returns:
    An instance of `Model` reproducing the behavior
    of the original model, on top of new input tensors,
    using newly instantiated weights. The cloned model may behave
    differently from the original model if a custom `clone_function`
    or `call_function` modifies a layer or layer call.

Example:

```python
# Create a test Sequential model.
model = keras.Sequential([
    keras.layers.Input(shape=(728,)),
    keras.layers.Dense(32, activation='relu'),
    keras.layers.Dense(1, activation='sigmoid'),
])
# Create a copy of the test model (with freshly initialized weights).
new_model = clone_model(model)
```

Using a `clone_function` to make a model deterministic by setting the
random seed everywhere:

```python
def clone_function(layer):
    config = layer.get_config()
    if "seed" in config:
        config["seed"] = 1337
    return layer.__class__.from_config(config)

new_model = clone_model(model, clone_function=clone_function)
```

Using a `call_function` to add a `Dropout` layer after each `Dense` layer
(without recreating new layers):

```python
def call_function(layer, *args, **kwargs):
    out = layer(*args, **kwargs)
    if isinstance(layer, keras.layers.Dense):
        out = keras.layers.Dropout(0.5)(out)
    return out

new_model = clone_model(
    model,
    clone_function=lambda x: x,  # Reuse the same layers.
    call_function=call_function,
)
```

Note that subclassed models cannot be cloned by default,
since their internal layer structure is not known.
To achieve equivalent functionality
as `clone_model` in the case of a subclassed model, simply make sure
that the model class implements `get_config()`
(and optionally `from_config()`), and call:

```python
new_model = model.__class__.from_config(model.get_config())
```

In the case of a subclassed model, you cannot use a custom
`clone_function`.

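
Not part of the file: one more hedged sketch of the `clone_function` use case the docstring mentions (swapping each `LSTM` for a `Bidirectional(LSTM(...))`), with a toy Functional model:

```python
import keras

def wrap_lstm(layer):
    # Wrap LSTM layers; clone everything else via the default config round trip.
    if isinstance(layer, keras.layers.LSTM):
        return keras.layers.Bidirectional(
            keras.layers.LSTM.from_config(layer.get_config())
        )
    return layer.__class__.from_config(layer.get_config())

inputs = keras.Input(shape=(10, 4))
outputs = keras.layers.Dense(1)(keras.layers.LSTM(8)(inputs))
model = keras.Model(inputs, outputs)

new_model = keras.models.clone_model(model, clone_function=wrap_lstm)
```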

.tether/man/initializer_constant.txt

Lines changed: 2 additions & 1 deletion
@@ -52,7 +52,7 @@ class Constant(keras.src.initializers.initializer.Initializer)
 | ----------------------------------------------------------------------
 | Class methods defined here:
 |
-| from_config(config) from builtins.type
+| from_config(config)
 | Instantiates an initializer from a configuration dictionary.
 |
 | Example:
@@ -69,3 +69,4 @@ class Constant(keras.src.initializers.initializer.Initializer)
 | Returns:
 | An `Initializer` instance.
 |
+
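
Not part of the diff: the corresponding round trip for an initializer, sketched with `Constant`:

```python
import keras

init = keras.initializers.Constant(value=3.0)
config = init.get_config()

# Rebuild the initializer from its config, per the entry above.
restored = keras.initializers.Constant.from_config(config)
```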

.tether/man/keras.losses.txt

Lines changed: 1 addition & 1 deletion
@@ -77,7 +77,7 @@ CosineSimilarity(
 name='cosine_similarity'
 )
 ctc(y_true, y_pred)
-CTC(reduction='sum_over_batch_size', name='sparse_categorical_crossentropy')
+CTC(reduction='sum_over_batch_size', name='ctc')
 deserialize(name, custom_objects=None)
 dice(y_true, y_pred)
 Dice(reduction='sum_over_batch_size', name='dice')
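
The corrected default can be checked directly; a minimal sketch, assuming Keras 3.3.3 where `keras.losses.CTC` carries the fixed name:

```python
import keras

loss = keras.losses.CTC()
print(loss.name)  # expected: 'ctc' (previously defaulted to 'sparse_categorical_crossentropy')
```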
