✨ Enable regression & misc #27

Merged · 73 commits merged into dev on Jul 8, 2023
Conversation

@o-laurent (Contributor) commented on Jun 13, 2023

  • Compress GNNL and NNL
  • Make the MLP more flexible
  • Do we really need to add a parameter to the CLI (regression vs. classification)? Why not use dm for this?
  • Add VGGs
  • Add LeNet
  • Add a setup.py for when poetry is not available, and mention it on the installation page
  • Improve the MLP baseline
  • Add ensemble regression
  • How to deal with multivariate regression cases (different output shapes)? -> Ask for the dimension first (see the MLP sketch after this list)
  • Fix Bayesian layers? (cf failed tests: 🐛 Bayesian layers may fail during tests #28)
  • Add Bayesian models & losses
  • Add deep ensembles method
  • Add a Bayesian tutorial
  • Fix tutorial generation
  • Make tutorial gallery static
  • Add a pretrained tutorial
  • Add a regression tutorial
  • Add tests
  • Add docs
  • Improve VGG optim recipes?
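
As a rough illustration of the "ask for the dimension first" idea above, here is a minimal sketch of a small regression MLP whose output size is passed explicitly by the user, so univariate and multivariate targets share the same code path. The class name and arguments are hypothetical and not the actual TorchUncertainty API.

```python
import torch
from torch import nn


class SmallMLP(nn.Module):
    """Minimal MLP for (multivariate) regression; out_features is user-provided."""

    def __init__(self, in_features: int, out_features: int, hidden_dim: int = 50) -> None:
        super().__init__()
        self.model = nn.Sequential(
            nn.Linear(in_features, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, out_features),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.model(x)


# A 3-dimensional regression target simply means out_features=3.
mlp = SmallMLP(in_features=8, out_features=3)
preds = mlp(torch.randn(16, 8))  # shape: (16, 3)
```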

Done for now:
  • Add an arg to the CLI to split regression and cls 🔨
  • Modify tests accordingly ✔️
  • Add a regression recipe ✨
  • Add a small MLP model ✨
  • Add an MLP baseline ✨
  • Add a UCIRegression Datamodule ✨
  • Add a GaussianNLL lightning metric ✨ (see the sketch after this list)
  • Add an experiment ✨
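
As a rough sketch of what the GaussianNLL lightning metric can look like, here is a minimal torchmetrics-based version that accumulates torch.nn.functional.gaussian_nll_loss over an epoch; the class name and exact signature are assumptions, not necessarily the implementation added in this PR.

```python
import torch
from torchmetrics import Metric


class GaussianNLLMetric(Metric):
    """Running average of the Gaussian negative log-likelihood."""

    def __init__(self, **kwargs) -> None:
        super().__init__(**kwargs)
        self.add_state("total_nll", default=torch.tensor(0.0), dist_reduce_fx="sum")
        self.add_state("count", default=torch.tensor(0), dist_reduce_fx="sum")

    def update(self, mean: torch.Tensor, target: torch.Tensor, var: torch.Tensor) -> None:
        # Sum the per-element NLL so that compute() can return a proper mean.
        self.total_nll += torch.nn.functional.gaussian_nll_loss(
            mean, target, var, reduction="sum"
        )
        self.count += target.numel()

    def compute(self) -> torch.Tensor:
        return self.total_nll / self.count
```

In a LightningModule, such a metric would typically be updated in the validation/test steps and logged with self.log.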

@o-laurent self-assigned this on Jun 13, 2023
@o-laurent changed the title from "Start the regression branch ✨" to "Enable regression training ✨" on Jun 16, 2023
@o-laurent changed the title from "Enable regression training ✨" to "Enable regression training & misc ✨" on Jun 23, 2023
@o-laurent changed the title from "Enable regression training & misc ✨" to "Enable regression & misc ✨" on Jun 23, 2023
@o-laurent marked this pull request as ready for review on July 4, 2023 at 14:49
@o-laurent (Contributor, Author) commented on Jul 4, 2023

Documentation is still missing, but the code seems approximately fine. I'm working on the docs now.

@o-laurent requested a review from alafage on July 4, 2023 at 14:50
@o-laurent changed the title from "Enable regression & misc ✨" to "✨ Enable regression & misc" on Jul 4, 2023
Review threads were opened on the following files:
  • experiments/classification/mnist/bayesian_lenet.py (outdated)
  • torch_uncertainty/optimization_procedures.py
  • torch_uncertainty/models/vgg/packed.py
  • torch_uncertainty/models/utils.py (outdated)
  • torch_uncertainty/models/lenet.py (outdated)
  • torch_uncertainty/metrics/nll.py
  • torch_uncertainty/layers/bayesian_layers/bayes_linear.py (outdated)
  • torch_uncertainty/datasets/uci_regression.py
  • torch_uncertainty/baselines/regression/mlp.py (outdated)
@alafage (Contributor) left a comment


It seems to me that there are typos in some datamodules when defining both the train and validation sets using random_split. Otherwise, this pull request is fine enough to be merged into dev :)
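
For reference, a minimal sketch of the intended pattern, assuming a single random_split call producing disjoint train/val subsets (the dataset and split sizes below are placeholders, not the datamodules' actual values):

```python
import torch
from torch.utils.data import TensorDataset, random_split

# Placeholder dataset; in the datamodules this would be e.g. the CIFAR-10 train set.
full_train = TensorDataset(torch.randn(1_000, 3, 32, 32), torch.randint(0, 10, (1_000,)))

# One random_split call yields disjoint subsets; a typical typo is assigning the
# same subset to both attributes, or splitting twice with independent calls.
train_set, val_set = random_split(
    full_train,
    [900, 100],
    generator=torch.Generator().manual_seed(42),
)
```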

A review thread was opened on torch_uncertainty/datamodules/cifar10.py (outdated)
@alafage merged commit 7d4aecb into dev on Jul 8, 2023
1 check passed
Labels: enhancement (New feature or request)
2 participants