Releases: speediedan/finetuning-scheduler
Fine-Tuning Scheduler Release 2.4.0
[2.4.0] - 2024-08-15
Added
- Support for Lightning and PyTorch 2.4.0
- Support for Python 3.12
Changed
- Changed the default value of the frozen_bn_track_running_stats option of the FTS callback constructor to True.
Deprecated
- Removed support for PyTorch 2.0
- Removed support for Python 3.8
Fine-Tuning Scheduler Patch Release 2.3.3
[2.3.3] - 2024-07-09
- Support for Lightning <= 2.3.3 (includes critical security fixes) and PyTorch <= 2.3.1
Fine-Tuning Scheduler Release 2.3.2
[2.3.2] - 2024-07-08
- Support for Lightning <= 2.3.2 and PyTorch <= 2.3.1
Thanks to the following users/contributors for their feedback and/or contributions in this release:
@josedvq
Fine-Tuning Scheduler Feature Teaser Release 2.3.0
Note
Because Lightning is not currently planning an official 2.3.0 release, this FTS release is marked as a pre-release and pins a lightning 2.3.0dev commit. A return to the normal Lightning cadence is expected with 2.4.0, and FTS will release accordingly. Installation of this FTS pre-release can either follow the normal installation from source or use the release archive, e.g.:
export FTS_VERSION=2.3.0 && \
wget https://github.com/speediedan/finetuning-scheduler/releases/download/v${FTS_VERSION}-rc1/finetuning_scheduler-${FTS_VERSION}rc1.tar.gz && \
pip install finetuning_scheduler-${FTS_VERSION}rc1.tar.gz
[2.3.0] - 2024-05-17
Added
- Support for Lightning and PyTorch 2.3.0
- Introduced the frozen_bn_track_running_stats option to the FTS callback constructor, allowing the user to override the default Lightning behavior that disables track_running_stats when freezing BatchNorm layers. Resolves #13.
Deprecated
- Removed support for PyTorch 1.13
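As a rough illustration of what the frozen_bn_track_running_stats option controls, here is a simplified sketch with stand-in classes. The FakeBatchNorm class and freeze_bn function below are hypothetical; the real logic lives in FTS and Lightning's BaseFinetuning, and the actual layers are torch.nn BatchNorm modules.

```python
class FakeBatchNorm:
    """Hypothetical stand-in for a torch.nn BatchNorm layer."""
    def __init__(self):
        self.track_running_stats = True

def freeze_bn(bn, frozen_bn_track_running_stats=False):
    """Sketch of the freezing decision only, not the actual implementation.

    By default, Lightning disables track_running_stats when freezing a
    BatchNorm layer; frozen_bn_track_running_stats=True overrides that so
    frozen BN layers keep updating their running statistics.
    """
    if not frozen_bn_track_running_stats:
        bn.track_running_stats = False  # default Lightning behavior
    return bn

default_bn = freeze_bn(FakeBatchNorm())
override_bn = freeze_bn(FakeBatchNorm(), frozen_bn_track_running_stats=True)
print(default_bn.track_running_stats)   # False: stats tracking disabled
print(override_bn.track_running_stats)  # True: stats still tracked
```

Note that as of FTS 2.4.0 the option's default flipped to True, so frozen BatchNorm layers keep tracking running statistics unless the user opts out.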
Fine-Tuning Scheduler Patch Release 2.2.4
[2.2.4] - 2024-05-04
Added
- Support for Lightning 2.2.4 and PyTorch 2.2.2
Fine-Tuning Scheduler Patch Release 2.2.1
[2.2.1] - 2024-03-04
Added
- Support for Lightning 2.2.1
Fine-Tuning Scheduler Release 2.2.0
[2.2.0] - 2024-02-08
Added
- Support for Lightning and PyTorch 2.2.0
- FTS now inspects any base EarlyStopping or ModelCheckpoint configuration passed in by the user and applies that configuration when instantiating the required FTS callback dependencies (i.e., FTSEarlyStopping or FTSCheckpoint). Part of the resolution to #12.
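A simplified sketch of this configuration-adoption behavior, using stand-in classes rather than the real Lightning and FTS types (the adopt_user_config helper is hypothetical, not the actual FTS internals):

```python
class EarlyStopping:
    """Stand-in for lightning.pytorch.callbacks.EarlyStopping."""
    def __init__(self, monitor="val_loss", patience=3):
        self.monitor = monitor
        self.patience = patience

class FTSEarlyStopping(EarlyStopping):
    """Stand-in for the FTS-specific dependency that FTS instantiates."""

def adopt_user_config(user_callback):
    """Carry the user's base-callback configuration over to the FTS dependency."""
    return FTSEarlyStopping(monitor=user_callback.monitor,
                            patience=user_callback.patience)

# The user's plain EarlyStopping settings survive in the FTS callback
fts_es = adopt_user_config(EarlyStopping(monitor="val_acc", patience=10))
print(fts_es.monitor, fts_es.patience)  # val_acc 10
```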
Changed
- Updated reference to the renamed FSDPPrecision
- Increased the jsonargparse minimum supported version to 4.26.1
Fixed
- Explicitly rank_zero_only-guarded ScheduleImplMixin.save_schedule and ScheduleImplMixin.gen_ft_schedule. Some codepaths were incorrectly invoking them from non-rank_zero_only guarded contexts. Resolved #11.
- Added a note in the documentation indicating more clearly the behavior of FTS when no monitor metric configuration is provided. Part of the resolution to #12.
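The guard pattern referenced in the fix above can be sketched as follows. This is a minimal stand-in, assuming a GLOBAL_RANK environment variable; Lightning ships its own rank_zero_only utility, and save_schedule here is a hypothetical substitute for ScheduleImplMixin.save_schedule.

```python
import os
from functools import wraps

def rank_zero_only(fn):
    """Minimal sketch of a rank-zero guard: run fn only on the rank-0 process."""
    @wraps(fn)
    def wrapped(*args, **kwargs):
        if int(os.environ.get("GLOBAL_RANK", "0")) == 0:
            return fn(*args, **kwargs)
        return None  # no-op on non-zero ranks
    return wrapped

@rank_zero_only
def save_schedule(path):
    """Hypothetical stand-in for ScheduleImplMixin.save_schedule."""
    return f"schedule written to {path}"

print(save_schedule("ft_schedule.yaml"))  # runs only on rank 0
```

Applying the guard at the function definition means every codepath invoking save_schedule is protected, rather than relying on each caller to check the rank first.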
Deprecated
- Removed support for PyTorch 1.12
- Removed legacy FTS examples
Thanks to the following users/contributors for their feedback and/or contributions in this release:
@Davidham3 @jakubMitura14
Fine-Tuning Scheduler Patch Release 2.1.4
[2.1.4] - 2024-02-02
Added
- Support for Lightning 2.1.4
Changed
- Bumped the sphinx requirement to >5.0,<6.0
Deprecated
- Removed deprecated lr verbose init param usage
- Removed deprecated tensorboard.dev references
Fine-Tuning Scheduler Release 2.1.3
[2.1.3] - 2023-12-21
Added
- Support for Lightning 2.1.3
Fine-Tuning Scheduler Release 2.1.2
[2.1.2] - 2023-12-20
Added
- Support for Lightning 2.1.2
Fixed
- Explicitly rank_zero_only-guarded ScheduleImplMixin.save_schedule and ScheduleImplMixin.gen_ft_schedule. Some codepaths were incorrectly invoking them from non-rank_zero_only guarded contexts. Resolves #11.
Thanks to the following users/contributors for their feedback and/or contributions in this release:
@Davidham3