This repository was archived by the owner on Feb 18, 2026. It is now read-only.

[INTERNAL] RDS-1527: changing defaults so that model params and blueprints match#620

Open
andreatgretel wants to merge 1 commit into develop from am/RDS-1527

Conversation

@andreatgretel (Contributor) commented:

  1. num_records is always 1000 in all task defaults; in param defaults it is 5000 for TabFT/TabGAN and 10 for TextFT. Recommendation: set the param default to 1000 in all cases (though this may be slow for TextFT).
  2. For TextFT, steps is 750 in the task default; in param defaults, both steps and epochs are empty. I think this is fine: we need to be opinionated about some params in the task defaults, but not necessarily in the param defaults.
  3. For TextFT, generate.maximum_text_length is 100 in the task default and 42 in the param default, and train.max_tokens is 512 in the param default (empty in the task default). Recommendation: set all of these to 128. That is still relatively short, but at least wouldn't affect runtime too much.
  4. For TabGAN, almost every parameter mismatches: generator_dim and discriminator_dim are 1024x1024 in task and 256x256 in param; generator_lr and discriminator_lr are 2e-4/2e-4 in task and 1e-4/3.3e-4 in param; epochs is auto in task and 600 in param; batch_size is auto in task and 500 in param. Recommendation: make all param defaults equal to the task defaults.
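To make the intent of the list above concrete, here is a minimal sketch of the recommended aligned defaults and a consistency check. The dict names, nesting, and the `check_matches_task_defaults` helper are illustrative assumptions, not the actual blueprint schema; the values are the recommendations from the items above.

```python
# Hypothetical sketch: recommended param defaults after aligning them with
# the task defaults (names/structure are illustrative, values from this PR).
RECOMMENDED_PARAM_DEFAULTS = {
    "tabular_ft": {"num_records": 1000},
    "tabular_gan": {
        "num_records": 1000,
        "generator_dim": [1024, 1024],       # was 256x256 in param defaults
        "discriminator_dim": [1024, 1024],   # was 256x256 in param defaults
        "generator_lr": 2e-4,                # was 1e-4 in param defaults
        "discriminator_lr": 2e-4,            # was 3.3e-4 in param defaults
        "epochs": "auto",                    # was 600 in param defaults
        "batch_size": "auto",                # was 500 in param defaults
    },
    "text_ft": {
        "num_records": 1000,                 # was 10; may be slow
        "generate": {"maximum_text_length": 128},  # was 42
        "train": {"max_tokens": 128},              # was 512
    },
}


def check_matches_task_defaults(param_defaults, task_defaults):
    """Return the keys whose param default disagrees with the task default.

    Keys that are absent from param_defaults (i.e. left unset on purpose,
    like steps/epochs for TextFT) are not counted as mismatches.
    """
    mismatches = []
    for key, task_value in task_defaults.items():
        if key in param_defaults and param_defaults[key] != task_value:
            mismatches.append(key)
    return mismatches
```

For example, checking the old TabFT param default against the task default flags the mismatch: `check_matches_task_defaults({"num_records": 5000}, {"num_records": 1000})` returns `["num_records"]`, while the recommended TabGAN defaults above produce no mismatches against matching task defaults.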

@vercel bot commented Jun 3, 2025:

The latest updates on your projects:
gretel-blueprints: ✅ Ready, preview updated Jun 3, 2025 2:18pm (UTC)

@nakolean nakolean changed the base branch from main to develop June 3, 2025 15:03

2 participants