Commit 3b8bade

init learning_rate_map when input learning rate is a Variable
1 parent 28ff1cd commit 3b8bade

File tree

1 file changed: +3 additions, −0 deletions

python/paddle/v2/fluid/optimizer.py

Lines changed: 3 additions & 0 deletions
@@ -45,6 +45,9 @@ def __init__(self, learning_rate, global_step=None, regularization=None):
         # each program should have a independent learning rate
         # program -> Variable(learning_rate)
         self._learning_rate_map = defaultdict(lambda: None)
+        if isinstance(self._learning_rate, framework.Variable):
+            self._learning_rate_map[framework.default_main_program(
+            )] = self._learning_rate
         # Dictionary of accumulators. Some optimizer subclasses need to
         # allocate and manage extra variables associated with the parameters
         # to train. These variables are called accumulators.
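
The change seeds the per-program learning-rate map at construction time: when the optimizer is given a learning rate that is already a framework.Variable, that Variable is registered for the default main program instead of leaving the map empty. Below is a minimal, standalone sketch of the same pattern; the Variable, Program, and default_main_program definitions are simplified stand-ins for the fluid framework objects, not the real API.

    from collections import defaultdict


    class Variable:
        """Simplified stand-in for framework.Variable."""

        def __init__(self, name):
            self.name = name


    class Program:
        """Simplified stand-in for a fluid Program."""


    _MAIN_PROGRAM = Program()


    def default_main_program():
        """Simplified stand-in for framework.default_main_program()."""
        return _MAIN_PROGRAM


    class Optimizer:
        def __init__(self, learning_rate):
            self._learning_rate = learning_rate
            # Each program gets its own learning-rate Variable; programs
            # without one yet map to None.
            self._learning_rate_map = defaultdict(lambda: None)
            # The pattern from this commit: if the caller already passes a
            # Variable, register it for the default main program up front.
            if isinstance(self._learning_rate, Variable):
                self._learning_rate_map[default_main_program()] = self._learning_rate


    # Usage: a Variable learning rate is visible in the map immediately.
    lr = Variable("learning_rate@0")
    opt = Optimizer(learning_rate=lr)
    assert opt._learning_rate_map[default_main_program()] is lr

Because the map is a defaultdict that falls back to None, later code can still look up any program and detect whether a learning-rate Variable has yet to be created for it.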
