
Commit ab8f86d

Release 1.5.1 for TF1.2
1 parent 884f6ed commit ab8f86d

File tree: 8 files changed (+21 additions, −676 deletions)


README.md

Lines changed: 1 addition & 0 deletions

@@ -18,6 +18,7 @@
 TensorLayer is a deep learning and reinforcement learning library based on [Google TensorFlow](https://www.tensorflow.org). It provides rich data pre-processing, training, post-processing and serving modules that help researchers and engineers in building complex machine learning workflows.

 # What's New
+* Support TensorFlow 1.2
 * Join [Slack](https://join.slack.com/tensorlayer/shared_invite/MTk1MjM0NDk5MDg5LTE0OTcwOTQyNTEtODFjY2QzYjdmZQ) Now.
 * [Attention Seq2seq](https://github.com/zsdonghao/tensorlayer/issues/164) help wanted.
 * Support [Sub-pixel Convolution](http://tensorlayer.readthedocs.io/en/latest/modules/layers.html#super-resolution-layer) for Super-resolution.
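Since the headline item here is TensorFlow 1.2 support, a quick sanity check of an environment against this release might look like the sketch below (the version strings and the tl.__version__ attribute are illustrative assumptions, not an official install recipe):

    # Sketch: confirm the TensorFlow / TensorLayer pairing this release targets.
    import tensorflow as tf
    import tensorlayer as tl

    print(tf.__version__)  # expected to start with '1.2' for this release
    print(tl.__version__)  # expected '1.5.1' (assumes tl.__version__ is defined)
    assert tf.__version__.startswith('1.2'), 'TensorLayer 1.5.1 targets TensorFlow 1.2'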

docs/conf.py

Lines changed: 3 additions & 3 deletions

@@ -67,9 +67,9 @@
 # built documents.
 #
 # The short X.Y version.
-version = '1.5.0'
+version = '1.5.1'
 # The full version, including alpha/beta/rc tags.
-release = '1.5.0'
+release = '1.5.1'

 # The language for content autogenerated by Sphinx. Refer to documentation
 # for a list of supported languages.
@@ -143,7 +143,7 @@
 # The name for this set of Sphinx documents.
 # "<project> v<release> documentation" by default.
 #
-# html_title = 'TensorLayer v1.5.0'
+# html_title = 'TensorLayer v1.5.1'

 # A shorter title for the navigation bar. Default is the same as html_title.
 #
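As an aside, a common way to keep these Sphinx strings from drifting out of sync with the package is to derive them from the installed library instead of hand-editing them each release; a rough sketch, assuming TensorLayer exposes a __version__ attribute (not something this commit does):

    # docs/conf.py -- sketch only; assumes tensorlayer.__version__ exists
    import tensorlayer

    # The short X.Y version.
    version = '.'.join(tensorlayer.__version__.split('.')[:2])
    # The full version, including alpha/beta/rc tags.
    release = tensorlayer.__version__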

example/_tf0.12/tutorial_dynamic_rnn.py

Lines changed: 9 additions & 6 deletions

@@ -26,7 +26,8 @@
 X[1,6:] = 0
 X_lengths = [10, 6]

-cell = tf.nn.rnn_cell.LSTMCell(num_units=64, state_is_tuple=True)
+cell = tf.contrib.rnn.BasicLSTMCell(num_units=64, state_is_tuple=True)
+# cell = tf.nn.rnn_cell.LSTMCell(num_units=64, state_is_tuple=True) # TF 0.12

 outputs, last_states = tf.nn.dynamic_rnn(
     cell=cell,
@@ -47,7 +48,8 @@
 ## 2. How to control the initial state
 batch_size = X.shape[0]
 with tf.variable_scope('name') as vs:
-    cell = tf.nn.rnn_cell.LSTMCell(num_units=64, state_is_tuple=True)
+    cell = tf.contrib.rnn.BasicLSTMCell(num_units=64, state_is_tuple=True)
+    # cell = tf.nn.rnn_cell.LSTMCell(num_units=64, state_is_tuple=True) # TF 0.12
     initial_state = cell.zero_state(batch_size, dtype=tf.float64)  # "float"
     outputs, last_states = tf.nn.dynamic_rnn(
         cell=cell,
@@ -84,11 +86,12 @@ def retrieve_seq_length_op(data):
     return length

 sequence_length = retrieve_seq_length_op(
-    incoming if isinstance(X, tf.Tensor) else tf.pack(X))
+    incoming if isinstance(X, tf.Tensor) else tf.stack(X))  # tf.pack(X) in TF 0.12

 batch_size = X.shape[0]
 with tf.variable_scope('name2') as vs:  # , initializer=tf.constant_initializer(value=0.1)) as vs:
-    cell = tf.nn.rnn_cell.LSTMCell(num_units=64, state_is_tuple=True)
+    cell = tf.contrib.rnn.BasicLSTMCell(num_units=64, state_is_tuple=True)
+    # cell = tf.nn.rnn_cell.LSTMCell(num_units=64, state_is_tuple=True) # TF 0.12
     initial_state = cell.zero_state(batch_size, dtype=tf.float64)  # "float"
     outputs, last_states = tf.nn.dynamic_rnn(
         cell=cell,
@@ -106,11 +109,11 @@ def retrieve_seq_length_op(data):
 # print(sequence_length)
 # exit()
 # automatically get the last output
-outputs = tf.transpose(tf.pack(outputs), [1, 0, 2])
+outputs = tf.transpose(tf.stack(outputs), [1, 0, 2])  # was tf.pack(outputs) in TF 0.12
 last_outputs = advanced_indexing_op(outputs, sequence_length)
 last_states = result[0]["last_states"]
 sess = tf.Session()
-sess.run(tf.initialize_all_variables())
+tl.layers.initialize_global_variables(sess)
 # print('last outputs',sess.run(last_outputs)) # (2, 64) # TO DO
 # print('last lstm states',last_states, last_states.c.shape, last_states.h.shape)
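Taken together, the edits above swap the TF 0.12 RNN idioms for their TF 1.x equivalents: tf.nn.rnn_cell.LSTMCell becomes tf.contrib.rnn.BasicLSTMCell, tf.pack becomes tf.stack, and tf.initialize_all_variables is replaced by tl.layers.initialize_global_variables. A minimal, self-contained sketch of the updated pattern follows; the array shapes and unit count are illustrative and not taken from the tutorial:

    # Sketch of the TF 1.2 API usage this commit migrates the tutorial to.
    import numpy as np
    import tensorflow as tf
    import tensorlayer as tl

    X = np.random.randn(2, 10, 8)   # (batch, time, features), float64
    X[1, 6:] = 0                    # second sequence is only 6 steps long
    X_lengths = [10, 6]

    # TF 1.x cell class (was tf.nn.rnn_cell.LSTMCell in TF 0.12)
    cell = tf.contrib.rnn.BasicLSTMCell(num_units=64, state_is_tuple=True)

    outputs, last_states = tf.nn.dynamic_rnn(
        cell=cell,
        dtype=tf.float64,
        sequence_length=X_lengths,
        inputs=X)

    # Note: tf.pack from TF 0.12 is now tf.stack in TF 1.x,
    # e.g. stacked = tf.stack([outputs, outputs])

    sess = tf.Session()
    tl.layers.initialize_global_variables(sess)  # replaces sess.run(tf.initialize_all_variables())
    print(sess.run(outputs).shape)               # (2, 10, 64)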

example/tutorial_dynamic_rnn.py

Lines changed: 0 additions & 167 deletions
This file was deleted.
