## What's New
❗Updates after Oct 10, 2022 are available in version >= 0.9❗
* Many changes since the last 0.6.x stable releases. They were previewed in 0.8.x dev releases but not everyone transitioned.
* `timm.models.layers` moved to `timm.layers`:
  * `from timm.models.layers import name` will still work via deprecation mapping (but please transition to `timm.layers`).
  * `import timm.models.layers.module` or `from timm.models.layers.module import name` needs to be changed now.
* Builder, helper, and non-model modules in `timm.models` now have a `_` prefix, i.e. `timm.models.helpers` -> `timm.models._helpers`. There are temporary deprecation mapping files, but those will be removed.
* All models now support `architecture.pretrained_tag` naming (e.g. `resnet50.rsb_a1`).
  * The `pretrained_tag` is the specific weight variant (e.g. a different head) for the architecture.
  * Using just `architecture` uses the 'default' pretrained tag (the first instance in `default_cfgs` for that arch).
* In adding pretrained tags, many model names that existed only to differentiate weights were renamed to use the tag (e.g. `vit_base_patch16_224_in21k` -> `vit_base_patch16_224.augreg_in21k`). There are deprecation mappings for these.
* The Hugging Face Hub (https://huggingface.co/timm) is now the primary source for `timm` weights. Model cards include links to papers, original source, and license.
* Previous 0.6.x releases can be cloned from the [0.6.x](https://github.com/rwightman/pytorch-image-models/tree/0.6.x) branch or installed via pip by pinning the version.
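The deprecation mapping for the `timm.models.layers` move can be pictured as a small PEP 562-style forwarding shim. This is an illustrative sketch with made-up module names (`oldpkg_layers`, `newpkg_layers`), not timm's actual implementation:

```python
import sys
import types
import warnings

# Stand-in for the new module location (e.g. timm.layers); names are hypothetical.
new_mod = types.ModuleType("newpkg_layers")
new_mod.DropPath = type("DropPath", (), {})
sys.modules["newpkg_layers"] = new_mod

# Shim at the old location (e.g. timm.models.layers) that forwards attribute
# access to the new module and warns (PEP 562 module-level __getattr__).
shim = types.ModuleType("oldpkg_layers")

def _forward(name):
    warnings.warn(
        "oldpkg_layers is deprecated; import from newpkg_layers instead",
        DeprecationWarning,
        stacklevel=2,
    )
    return getattr(sys.modules["newpkg_layers"], name)

shim.__getattr__ = _forward
sys.modules["oldpkg_layers"] = shim

# Old-style import still resolves to the same object as the new path.
from oldpkg_layers import DropPath  # emits a DeprecationWarning
print(DropPath is new_mod.DropPath)  # True
```

This kind of shim keeps old `from pkg.old import name` statements working, which is why the `import timm.models.layers.module` form (a real submodule path, not an attribute lookup) cannot be forwarded the same way and must be updated.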
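The `architecture.pretrained_tag` convention above splits the model name at the first dot. A minimal sketch of that parsing (a hypothetical helper for illustration; timm's own parsing is internal and may differ in detail):

```python
def split_model_name(model_name: str):
    """Split 'architecture.pretrained_tag' into (architecture, tag).

    Hypothetical illustration of the naming scheme described above.
    """
    arch, _, tag = model_name.partition(".")
    # No tag means the 'default' pretrained tag is used (the first entry
    # in default_cfgs for that architecture).
    return arch, tag or None

print(split_model_name("resnet50.rsb_a1"))                    # ('resnet50', 'rsb_a1')
print(split_model_name("vit_base_patch16_224.augreg_in21k"))  # ('vit_base_patch16_224', 'augreg_in21k')
print(split_model_name("resnet50"))                           # ('resnet50', None)
```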
### May 11, 2023
* `timm` 0.9 released, transition from 0.8.x dev releases
### May 10, 2023
* Hugging Face Hub downloading is now default, 1132 models on https://huggingface.co/timm, 1163 weights in `timm`
* DINOv2 vit feature backbone weights added thanks to [Leng Yue](https://github.com/leng-yue)
* Experimental `get_intermediate_layers` function on vit/deit models for grabbing hidden states (inspired by DINO impl). This is WIP and may change significantly... feedback welcome.
* Model creation throws an error if `pretrained=True` and no weights exist (instead of continuing with random initialization)
* Fix regression with inception / nasnet TF sourced weights with 1001 classes in original classifiers
* bitsandbytes (https://github.com/TimDettmers/bitsandbytes) optimizers added to the factory; use the `bnb` prefix, e.g. `bnbadam8bit`
* Misc cleanup and fixes
* Final testing before switching to 0.9 and bringing `timm` out of pre-release state
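The `bnb` optimizer prefix mentioned above can be pictured as a simple name dispatch in an optimizer factory. This is a sketch only; `resolve_optimizer_backend` is a made-up helper to show the convention, not timm's factory API:

```python
def resolve_optimizer_backend(opt_name: str):
    """Map a factory optimizer name to (backend, base_name).

    Hypothetical sketch of the prefix convention: names starting with
    'bnb' come from bitsandbytes, everything else from torch/timm.
    """
    prefix = "bnb"
    if opt_name.startswith(prefix):
        # 'bnbadam8bit' -> bitsandbytes backend, 'adam8bit' optimizer
        return "bitsandbytes", opt_name[len(prefix):]
    return "torch", opt_name

print(resolve_optimizer_backend("bnbadam8bit"))  # ('bitsandbytes', 'adam8bit')
print(resolve_optimizer_backend("adamw"))        # ('torch', 'adamw')
```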