
Commit f2e9941

Update db.rst
1 parent 94dabf6 commit f2e9941


docs/modules/db.rst

Lines changed: 17 additions & 4 deletions
@@ -134,6 +134,7 @@ it is very straightforward to connect to the TensorDB system.
 you can try the following code

 .. code-block:: python
+
   from tensorlayer.db import TensorDB
   db = TensorDB(ip='127.0.0.1', port=27017, db_name='your_db', user_name=None, password=None, studyID='ministMLP')
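As a note on the hunk above: the constructor already exposes `user_name` and `password`, so connecting to a remote, authenticated MongoDB only means filling them in. A minimal sketch, using a hypothetical remote host and credentials that are not part of this commit:

.. code-block:: python

  from tensorlayer.db import TensorDB

  # hypothetical remote MongoDB host and credentials; placeholders, not from the commit
  db = TensorDB(ip='mongo.example.com', port=27017, db_name='your_db',
                user_name='your_user', password='your_password', studyID='ministMLP')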
@@ -165,6 +166,7 @@ methods
 suppose we save the log at each step and the parameters at each epoch, the code can look like this

 .. code-block:: python
+
   for epoch in range(0, epoch_count):
       _, ac = sess.run([train_op, loss], feed_dict={x: X_batch, y_: y_batch})
       db.train_log({'accuracy': ac})
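The sentence above mentions saving the parameters at each epoch, but the hunk only shows the logging call. Below is a sketch of the combined loop; `db.train_log` comes from the diff, while `db.save_params` and `network.all_params` are assumptions about the surrounding TensorLayer code and are not confirmed by this commit.

.. code-block:: python

  for epoch in range(epoch_count):
      # one training step; X_batch and y_batch stand for the current mini-batch
      _, ac = sess.run([train_op, loss], feed_dict={x: X_batch, y_: y_batch})
      # log a scalar at every step (method shown in the diff)
      db.train_log({'epoch': epoch, 'accuracy': ac})
      # assumed API: persist the network parameters once per epoch
      db.save_params(sess.run(network.all_params), {'epoch': epoch})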
@@ -181,6 +183,7 @@ it is up to the user to specify how to convert the string back to models or jobs
 for example, in many of our cases, we simply specify the Python code.

 .. code-block:: python
+
   code = '''
   print("hello")
   '''
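Since it is left to the user to turn the stored string back into something executable, here is one minimal sketch of doing so, assuming the string holds plain Python source as in the example above; the use of `exec` is an illustration, not something prescribed by this commit.

.. code-block:: python

  # the architecture is stored as plain Python source code
  code = '''
  print("hello")
  '''

  # one possible way to turn the string back into behaviour: execute it
  exec(code)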
@@ -220,7 +223,9 @@ the TensorLabDemo has an import data interface, which allows the user to inject

 users can import data with the following code

-``db.import_data(X,y,{'type':'train'})``
+.. code-block:: python
+
+  db.import_data(X, y, {'type': 'train'})
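To make the new code block concrete, here is a short usage sketch; only `db.import_data` itself is from the diff, and the NumPy arrays are placeholder data invented for illustration.

.. code-block:: python

  import numpy as np

  # placeholder data standing in for a real dataset
  X = np.random.rand(100, 784).astype(np.float32)
  y = np.random.randint(0, 10, size=(100,))

  # tag the records so they can later be queried back as the training split
  db.import_data(X, y, {'type': 'train'})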
@@ -241,26 +246,34 @@ users can, based on the TensorLabDemo code, override the interface to suit their needs
 when training, the overall architecture is:
 first, find a data generator from the dataset module

-``g=datase.data_generator({"type":XXXX})``
+.. code-block:: python
+
+  g = dataset.data_generator({"type": [your type]})


 then initialize a model with a name
-``m=model('mytes')``
+.. code-block:: python
+
+  m = model('mytes')


 during training, connect the db logger and tensordb together
-``m.fit_generator(g,dblogger(tensordb,m),1000,100)``
+.. code-block:: python
+
+  m.fit_generator(g, dblogger(tensordb, m), 1000, 100)


 if the work is distributed, we have to save the model architecture, then reload and execute it

 .. code-block:: python
+
   db.save_model_architecture(code, {'name': 'mlp'})
   db.push_job({'name': 'mlp'}, {'type': XXXX}, {'batch': 1000, 'epoch': 100})


 the worker will run the job as the following code

 .. code-block:: python
+
   j = job.pop()
   g = dataset.data_generator(j.filter)
   c = tensordb.load_model_architecture(j.march)
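Putting the last two blocks of this hunk together, a worker-side sketch could look like the following. `push_job`, `data_generator`, `load_model_architecture`, `fit_generator`, and `dblogger` all appear in the diff; the `pop()` spelling, the `exec`-based rebuild of the model, and the sample value 'train' standing in for the XXXX placeholder are assumptions for illustration.

.. code-block:: python

  # submitting side (as in the diff); 'train' is a sample value for the XXXX placeholder
  db.save_model_architecture(code, {'name': 'mlp'})
  db.push_job({'name': 'mlp'}, {'type': 'train'}, {'batch': 1000, 'epoch': 100})

  # worker side: fetch the job, its data, and its stored architecture string
  j = job.pop()
  g = dataset.data_generator(j.filter)
  c = tensordb.load_model_architecture(j.march)

  # assumed glue: execute the stored source, which is expected to define `model`
  namespace = {}
  exec(c, namespace)
  m = namespace['model']('mlp')

  # train with the db logger attached, as in the earlier part of this hunk
  m.fit_generator(g, dblogger(tensordb, m), 1000, 100)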
