
Commit 1a1f0d8

MatteB03ndem0 authored and committed
Update Tensorboard use
1 parent 5183379 commit 1a1f0d8

File tree

6 files changed (+58, −137 lines)


tutorials/tutorial1/tutorial.ipynb

Lines changed: 3 additions & 5 deletions
@@ -505,7 +505,7 @@
    },
    {
     "cell_type": "code",
-    "execution_count": 10,
+    "execution_count": null,
     "id": "fcac93e4",
     "metadata": {},
     "outputs": [
@@ -546,10 +546,8 @@
     }
    ],
    "source": [
-    "# Load the TensorBoard extension\n",
-    "%load_ext tensorboard\n",
-    "# Show saved losses\n",
-    "%tensorboard --logdir 'tutorial_logs'"
+    "print('\\nTo load TensorBoard run load_ext tensorboard on your terminal')\n",
+    "print(\"To visualize the loss you can run tensorboard --logdir 'tutorial_logs' on your terminal\\n\")"
    ]
   },
   {

tutorials/tutorial1/tutorial.py

Lines changed: 3 additions & 5 deletions
@@ -261,13 +261,11 @@ def truth_solution(self, pts):
 
 # The solution is overlapped with the actual one, and they are barely indistinguishable. We can also take a look at the loss using `TensorBoard`:
 
-# In[10]:
+# In[ ]:
 
 
-# Load the TensorBoard extension
-get_ipython().run_line_magic('load_ext', 'tensorboard')
-# Show saved losses
-get_ipython().run_line_magic('tensorboard', "--logdir 'tutorial_logs'")
+print('\nTo load TensorBoard run load_ext tensorboard on your terminal')
+print("To visualize the loss you can run tensorboard --logdir 'tutorial_logs' on your terminal\n")
 
 
 # As we can see the loss has not reached a minimum, suggesting that we could train for longer! Alternatively, we can also take look at the loss using callbacks. Here we use `MetricTracker` from `pina.callback`:
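The new print() messages assume TensorBoard event files already exist under `tutorial_logs`. As a minimal sketch of how such files get written in the first place, assuming PyTorch's bundled `SummaryWriter` (the `sketch_run` subdirectory and the toy loss values are hypothetical, not from the tutorials):

```python
# Hedged sketch: write a toy scalar series to 'tutorial_logs' so the
# terminal command mentioned in the tutorials has something to display.
# (Assumption: PyTorch is installed; 'sketch_run' is a made-up run name.)
from torch.utils.tensorboard import SummaryWriter

log_dir = "tutorial_logs/sketch_run"
writer = SummaryWriter(log_dir=log_dir)
for step in range(100):
    # A decaying toy loss, stored under the tag "mean_loss"
    writer.add_scalar("mean_loss", 1.0 / (step + 1), step)
writer.close()
```

After this runs, `tensorboard --logdir 'tutorial_logs'` in a terminal serves a dashboard showing the logged curve.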

tutorials/tutorial2/tutorial.ipynb

Lines changed: 20 additions & 38 deletions
Large diffs are not rendered by default.

tutorials/tutorial2/tutorial.py

Lines changed: 3 additions & 4 deletions
@@ -311,12 +311,11 @@ def forward(self, x):
 
 # Let us compare the training losses for the various types of training
 
-# In[10]:
+# In[ ]:
 
 
-# Load the TensorBoard extension
-get_ipython().run_line_magic('load_ext', 'tensorboard')
-get_ipython().run_line_magic('tensorboard', "--logdir 'tutorial_logs'")
+print('To load TensorBoard run load_ext tensorboard on your terminal')
+print("To visualize the loss you can run tensorboard --logdir 'tutorial_logs' on your terminal")
 
 
 # ## What's next?
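The commit replaces the magics with unconditional print() calls. One alternative worth noting is guarding the magics so they run when a kernel is present and fall back to instructions otherwise; this sketch is illustrative, not what the commit actually does:

```python
# Sketch: run the TensorBoard magics only inside IPython/Jupyter,
# and print terminal instructions when executed as a plain script.
# (Assumption: this guard is an illustration, not the commit's approach.)
try:
    ipython = get_ipython()  # defined only inside IPython/Jupyter sessions
except NameError:
    ipython = None

if ipython is not None:
    ipython.run_line_magic("load_ext", "tensorboard")
    ipython.run_line_magic("tensorboard", "--logdir 'tutorial_logs'")
else:
    print("To visualize the loss run: tensorboard --logdir 'tutorial_logs'")
```

This keeps the inline dashboard for notebook users while remaining runnable in the exported `tutorial.py` scripts.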

tutorials/tutorial3/tutorial.ipynb

Lines changed: 23 additions & 78 deletions
Large diffs are not rendered by default.

tutorials/tutorial3/tutorial.py

Lines changed: 6 additions & 7 deletions
@@ -194,12 +194,11 @@ def forward(self, x):
 
 # Let's now plot the logging to see how the losses vary during training. For this, we will use `TensorBoard`.
 
-# In[5]:
+# In[ ]:
 
 
-# Load the TensorBoard extension
-get_ipython().run_line_magic('load_ext', 'tensorboard')
-get_ipython().run_line_magic('tensorboard', "--logdir 'tutorial_logs'")
+print('\nTo load TensorBoard run load_ext tensorboard on your terminal')
+print("To visualize the loss you can run tensorboard --logdir 'tutorial_logs' on your terminal\n")
 
 
 # Notice that the loss on the boundaries of the spatial domain is exactly zero, as expected! After the training is completed one can now plot some results using the `matplotlib`. We plot the predicted output on the left side, the true solution at the center and the difference on the right side using the `plot_solution` function.
@@ -335,12 +334,12 @@ def forward(self, x):
 plot_solution(solver=pinn, time=1)
 
 
-# We can see now that the results are way better! This is due to the fact that previously the network was not learning correctly the initial conditon, leading to a poor solution when time evolved. By imposing the initial condition the network is able to correctly solve the problem. We can also see using Tensorboard how the two losses decreased:
+# We can see now that the results are way better! This is due to the fact that previously the network was not learning correctly the initial conditon, leading to a poor solution when time evolved. By imposing the initial condition the network is able to correctly solve the problem. We can also see how the two losses decreased using Tensorboard.
 
-# In[11]:
+# In[ ]:
 
 
-get_ipython().run_line_magic('tensorboard', "--logdir 'tutorial_logs'")
+print("To visualize the loss you can run tensorboard --logdir 'tutorial_logs' on your terminal")
 
 
 # ## What's next?
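Since the tutorials now only print the terminal command, a small check that `tutorial_logs` actually contains event files can spare users a confusing empty dashboard. A hedged sketch (the message wording is hypothetical; event-file naming follows TensorBoard's `events.out.tfevents.*` convention):

```python
# Sketch: verify that TensorBoard event files exist under 'tutorial_logs'
# before pointing the user at the terminal command.
import glob
import os

log_dir = "tutorial_logs"
event_files = glob.glob(
    os.path.join(log_dir, "**", "events.out.tfevents.*"), recursive=True
)
if event_files:
    print(f"Found {len(event_files)} event file(s); "
          f"run: tensorboard --logdir '{log_dir}'")
else:
    print(f"No event files under '{log_dir}' yet; train the model first.")
```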

0 commit comments
