
Conversation

@theabhirath (Member) commented on Aug 1, 2022:

This PR does a bunch of stuff:

  1. Proposes to refine the API at the highest level so that models expose only the options that make sense for loading pretrained weights (which means removing configuration knobs such as drop rates). A sketch of the resulting call sites follows this list.
  2. Completes the migration towards `inchannels` and `nclasses` as uniform options for all models, helping along "Add inchannels, imsize, nclasses as kwargs for all constructors" (#176) - only `imsize` is left now.
  3. For general code cleanliness, adds a large number of type annotations that weren't there before. Together with the documentation (once refactored), these should help users decipher the more cryptic errors that can occur when working with the lower-level model APIs.
  4. Does away with default model configurations at the highest level. This could be restored, but the behaviour is rather ambiguous: what makes a ResNet-50 more suited to be the default configuration than a ResNet-18 or a ResNet-101?
  5. Throws in a small refactor of `invertedresidual` as prep for an EfficientNetv2 PR that will be landing shortly (the block's structure is sketched below).
  6. Uses `create_classifier` in most places where it applies, for brevity (also sketched below).
  7. Adds compat entries for packages introduced in "Overhaul of ResNet API" (#174): closes #191 (CUDA at version 3), #192 (ChainRulesCore at version 1), #193 (PartialFunctions at version 1) and #194 (NNlibCUDA at version 0.2). Also closes #180 (bump Functors compat to 0.3), because why not. The resulting `[compat]` section is sketched below.
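Concretely, the refined top-level API looks something like the sketch below. This is a minimal illustration assuming the keyword names discussed in this PR (`inchannels`, `nclasses`); treat it as a sketch of the intended call sites rather than the final API.

```julia
using Metalhead

# The depth/configuration must be chosen explicitly; there is no implicit
# default configuration at the top level anymore.
resnet18 = ResNet(18)

# Only options relevant to loading pretrained weights are exposed here.
resnet50 = ResNet(50; inchannels = 3, nclasses = 1000)

# Lower-level knobs such as drop rates are no longer top-level options;
# they live in the lower-level model-building API instead.
```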

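For context on the `invertedresidual` refactor: an inverted residual block (the MobileNetv2-style building block that EfficientNet also uses) expands channels with a 1×1 convolution, applies a depthwise convolution, projects back down with another 1×1 convolution, and adds a skip connection when input and output shapes match. A generic Flux sketch of the pattern, not Metalhead's actual implementation:

```julia
using Flux

# Illustrative inverted residual block; names and defaults are assumptions,
# not Metalhead's `invertedresidual` code.
function inverted_residual_sketch(inplanes, outplanes; stride = 1, expansion = 4)
    hidden = inplanes * expansion
    layers = Chain(
        Conv((1, 1), inplanes => hidden),                                 # 1×1 expansion
        BatchNorm(hidden, swish),
        Conv((3, 3), hidden => hidden; stride, pad = 1, groups = hidden), # depthwise
        BatchNorm(hidden, swish),
        Conv((1, 1), hidden => outplanes),                                # linear projection
        BatchNorm(outplanes),
    )
    # Residual connection only when the block preserves the feature shape.
    return stride == 1 && inplanes == outplanes ? SkipConnection(layers, +) : layers
end
```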
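Regarding `create_classifier`: the helper replaces hand-rolled pooling-plus-`Dense` classifier heads. The sketch below shows what such a helper typically builds; the signature and keyword names here are illustrative, not Metalhead's exact API.

```julia
using Flux

# Hypothetical sketch of a `create_classifier`-style helper.
function create_classifier_sketch(inplanes::Integer, nclasses::Integer;
                                  dropout_rate = nothing)
    return Chain(
        AdaptiveMeanPool((1, 1)),                                   # global average pooling
        Flux.flatten,                                               # (1, 1, C, N) -> (C, N)
        isnothing(dropout_rate) ? identity : Dropout(dropout_rate), # optional dropout
        Dense(inplanes => nclasses),                                # classification layer
    )
end
```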
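The compat entries from item 7, written out as they would appear in `Project.toml` (version bounds taken from the linked CompatHelper issues):

```toml
[compat]
CUDA = "3"
ChainRulesCore = "1"
Functors = "0.3"
NNlibCUDA = "0.2"
PartialFunctions = "1"
```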
@theabhirath added this to the 0.8 milestone on Aug 1, 2022
@theabhirath changed the base branch from master to cl/fix on August 2, 2022 at 13:54
@theabhirath changed the base branch from cl/fix back to master on August 2, 2022 at 13:54
@theabhirath requested a review from @darsnack on August 2, 2022 at 13:54
@theabhirath (Member, Author) commented:
Pending CI, this should be good to go!

@darsnack merged commit 7449985 into FluxML:master on Aug 3, 2022
@theabhirath deleted the refine branch on August 4, 2022 at 11:25