Changes from 1 commit
11 changes: 11 additions & 0 deletions sample.lua
@@ -13,12 +13,23 @@ cmd:option('-temperature', 1)
cmd:option('-gpu', 0)
cmd:option('-gpu_backend', 'cuda')
cmd:option('-verbose', 0)
cmd:option('-seed', 0)
local opt = cmd:parse(arg)


local checkpoint = torch.load(opt.checkpoint)
local model = checkpoint.model

if opt.seed == 0 then
opt.seed = torch.random()
end
torch.manualSeed(opt.seed)
Collaborator:

Might be simpler to force a manual seed only if the -seed argument is set:

if <flag set> then
  torch.manualSeed(...)
end

Author:

I agree that this code looks odd. The reason I did it this way is reproducibility: if you see some output you would like to see again, and you had the verbose flag set so the seed was reported, you can re-run with that seed - even if you didn't originally set the -seed flag. So the code always sets a seed, using your provided value if you set one and a random number if you didn't (or specified zero!). I thought this was clever ;-)

Collaborator:

Ah okay that makes sense.
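The pattern the author describes - always establish a seed, and make it reportable so any run can be repeated - can be sketched outside Torch. The following is a minimal illustration in Python using the standard `random` module (the function name `seed_rng` is hypothetical); the diff's Lua does the same thing with `torch.random()` and `torch.manualSeed()`:

```python
import random

def seed_rng(seed=0, verbose=False):
    """Always seed the RNG: use the caller's seed, or draw a random
    one when seed is 0 (the default), so every run is reproducible
    once the seed is known."""
    if seed == 0:
        seed = random.randrange(1, 2**31)  # pick an arbitrary seed
    random.seed(seed)
    if verbose:
        print('Random number seed: %d' % seed)
    return seed

# A run without an explicit seed still gets a reportable one...
s = seed_rng(verbose=True)
first = random.random()

# ...so the exact output can be reproduced later by passing it back.
seed_rng(seed=s)
assert random.random() == first
```

The design point being debated: seeding only when the flag is set (the collaborator's suggestion) is simpler, but then an unseeded run can never be reproduced; always seeding costs nothing and keeps every run repeatable.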


local msg = string.format('Random number seed: %d', opt.seed)
if opt.verbose == 1 then print(msg) end


local msg
if opt.gpu >= 0 and opt.gpu_backend == 'cuda' then
require 'cutorch'