This repository was archived by the owner on Jan 2, 2021. It is now read-only.

Commit 02d2fca (1 parent: 0c9937a)

Corrected value for adversarial loss. Don't refactor math the day after stopping coffee.

2 files changed (+2, -2 lines)

README.rst

Lines changed: 1 addition & 1 deletion

@@ -64,7 +64,7 @@ Pre-trained models are provided in the GitHub releases. Training your own is a

 # Train the model using an adversarial setup based on [4] below.
 python3.4 enhance.py --train "data/*.jpg" --model custom --scales=2 --epochs=250 \
-    --perceptual-layer=conv5_2 --smoothness-weight=2e4 --adversary-weight=2e5 \
+    --perceptual-layer=conv5_2 --smoothness-weight=2e4 --adversary-weight=1e3 \
     --generator-start=5 --discriminator-start=0 --adversarial-start=5 \
     --discriminator-size=64

enhance.py

Lines changed: 1 addition & 1 deletion

@@ -374,7 +374,7 @@ def loss_total_variation(self, x):
         return T.mean(((x[:,:,:-1,:-1] - x[:,:,1:,:-1])**2 + (x[:,:,:-1,:-1] - x[:,:,:-1,1:])**2)**1.25)

     def loss_adversarial(self, d):
-        return T.mean(1.0 - T.nnet.softplus(d[args.batch_size:]))
+        return T.mean(1.0 - T.nnet.softminus(d[args.batch_size:]))

     def loss_discriminator(self, d):
         return T.mean(T.nnet.softminus(d[args.batch_size:]) - T.nnet.softplus(d[:args.batch_size]))
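The substance of the fix is the swap from `softplus` to `softminus` in the generator's adversarial loss, matching the `softminus` already used in `loss_discriminator`. Theano's `T.nnet.softplus(x)` is `log(1 + exp(x))`; the definition of `softminus` is not shown in this diff, so the sketch below assumes the usual reflected counterpart `softminus(x) = x - softplus(x) = log(sigmoid(x))`. Under that assumption, a minimal NumPy sketch shows why the sign matters:

```python
import numpy as np

def softplus(x):
    # log(1 + exp(x)), matching T.nnet.softplus; >= 0 everywhere
    return np.log1p(np.exp(x))

def softminus(x):
    # Assumed definition: x - softplus(x) = -softplus(-x) = log(sigmoid(x)); <= 0 everywhere
    return x - softplus(x)

# Hypothetical discriminator scores for generated samples
d = np.array([-4.0, 0.0, 4.0])

# Before the fix: softplus grows linearly with d, so this term is unbounded
# below and the generator can drive the "loss" to -inf with no stable optimum.
old_term = 1.0 - softplus(d)

# After the fix: 1 - log(sigmoid(d)) is bounded below by 1 and decreases
# toward it as d -> +inf, i.e. as the discriminator is fooled.
new_term = 1.0 - softminus(d)

print(old_term)
print(new_term)
```

With the assumed definition, `1 - softminus(d)` is the standard non-saturating generator loss `-log(sigmoid(d))` plus a constant, which is presumably what the original `softplus` version was meant to be.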

0 commit comments

Comments
 (0)