Ddulaev/diploma successful exps #3
base: main
Conversation
Refactored the SASRec train_sasrec.py script into ipynb format (tested on DataSphere).
Added and tested a full tiger_baseline run.
Added TensorBoard logs for the SASRec and TIGER + RQ-VAE runs.
| "d_kv": 64, | ||
| "dropout": 0.1, | ||
| "activation": "relu", | ||
| "num_beams": 100, |
You can reduce this to 30; it will speed up the code, and the results should not change much.
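A minimal sketch of applying this suggestion, assuming the config is a plain JSON-style dict loaded at runtime (the keys are taken from the diff above; the override step itself is illustrative, not from the repo):

```python
import json

# Generation config as shown in the diff hunk above.
config = {"d_kv": 64, "dropout": 0.1, "activation": "relu", "num_beams": 100}

# Reviewer's suggestion: 30 beams should run noticeably faster during
# evaluation while leaving the ranking metrics roughly unchanged.
config["num_beams"] = 30

print(json.dumps(config, indent=2))
```

Beam search cost scales roughly linearly with `num_beams`, so dropping from 100 to 30 cuts most of the decoding time at evaluation.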
| "sampler_type": "tiger" | ||
| }, | ||
| "dataloader": { | ||
| "train_batch_size": 256, |
In your experiments you can use a larger batch size to utilize the GPU better; the main thing is that it stays the same across all experiments.
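The advice above can be sketched as a single shared constant that every experiment config reads from, plus a guard that the value really matches everywhere (experiment names and the value 1024 are illustrative assumptions, not from the repo):

```python
# One batch size for ALL experiments, so results stay comparable.
TRAIN_BATCH_SIZE = 1024  # larger than 256 to better utilize the GPU

experiments = {
    "sasrec": {"dataloader": {"train_batch_size": TRAIN_BATCH_SIZE}},
    "tiger_baseline": {"dataloader": {"train_batch_size": TRAIN_BATCH_SIZE}},
}

# Sanity check: every config uses the same train_batch_size.
batch_sizes = {cfg["dataloader"]["train_batch_size"] for cfg in experiments.values()}
assert len(batch_sizes) == 1, "train_batch_size must match across experiments"
```

Keeping the batch size fixed matters because it interacts with the effective learning rate; changing it between runs would confound any comparison of the models.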
```python
if item_frequency_counts is None:
    # We do not yet know final max, so start conservatively and grow if needed
    item_frequency_counts = {}
```
It's unclear why this can't be removed. What is the logic of setting it to None above?
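One common reason for the `None` sentinel is avoiding Python's mutable-default-argument pitfall; a hedged sketch of both options (the function and argument names are illustrative, not taken from the repo):

```python
from collections import Counter

def update_frequencies(items, item_frequency_counts=None):
    """Accumulate item counts.

    `None` as the default avoids the classic bug where a mutable default
    (`item_frequency_counts={}`) would be shared across all calls.
    """
    if item_frequency_counts is None:
        item_frequency_counts = Counter()
    item_frequency_counts.update(items)
    return item_frequency_counts

# The reviewer's alternative: have the caller pass the empty mapping
# explicitly, which makes the None check inside the function unnecessary.
counts = update_frequencies(["a", "b", "a"], Counter())
print(counts["a"])  # 2
```

If the dict is only ever created at one call site, passing `Counter()` (or `{}`) there directly is simpler; the `None` pattern only pays off when the argument has to be an optional default.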
Please edit the notebooks so they can be compared correctly: remove your meta-information about outputs and runs.