For some reason, I have to use the timm package offline. However, I found that if I use create_model(), for example:
self.img_encoder = timm.create_model("swin_base_patch4_window7_224", pretrained=True)
I would get
http.client.RemoteDisconnected: Remote end closed connection without response
I found that the function tried to fetch the pre-trained weights from the URL below, but the request failed:
https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window7_224_22kto1k.pth
Can I just download the pre-trained model and load it in my code, as with Hugging Face? (I have checked the timmdocs but found nothing about this.)
Yes, you can download all the models to a local directory (they can all be found in the project's release section). Then, on your offline system, put them under:
~/.cache/torch/hub/checkpoints
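That path is torch hub's default checkpoint cache. A minimal sketch of resolving the full target path with the standard library (assuming `TORCH_HOME` is unset, so PyTorch falls back to the `~/.cache/torch` default):

```python
import os

# Default torch hub checkpoint directory; assumes TORCH_HOME is not set,
# so the cache falls back to ~/.cache/torch.
cache_dir = os.path.expanduser(os.path.join("~", ".cache", "torch", "hub", "checkpoints"))

# File name taken from the release URL in the question.
ckpt = "swin_base_patch4_window7_224_22kto1k.pth"

print(os.path.join(cache_dir, ckpt))
```

If the file sits at exactly that path, `create_model(..., pretrained=True)` should pick it up from the cache instead of opening a connection.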
To be clearer, this is the ls output for that folder on my computer:
tf_efficientdet_d7x-f390b87c.pth
tf_efficientnet_b0_aa-827b6e33.pth
tf_efficientnet_b7_ra-6c08e654.pth
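So the workflow is: download the .pth on a machine with internet access, then copy it into that folder on the offline machine. A small self-contained sketch of the copy step (a temporary directory stands in for the real home directory, and an empty dummy file stands in for the real checkpoint, purely for illustration):

```python
import os
import shutil
import tempfile

# Temp dir standing in for the offline machine's home directory.
home = tempfile.mkdtemp()

# Recreate the torch hub cache layout under it.
cache = os.path.join(home, ".cache", "torch", "hub", "checkpoints")
os.makedirs(cache, exist_ok=True)

# Dummy file standing in for the checkpoint downloaded elsewhere.
src = os.path.join(home, "swin_base_patch4_window7_224_22kto1k.pth")
open(src, "wb").close()

# Copy it into the cache, as you would with scp or a USB drive.
shutil.copy(src, cache)

print(sorted(os.listdir(cache)))
# → ['swin_base_patch4_window7_224_22kto1k.pth']
```

On the real system you would of course copy into `~/.cache/torch/hub/checkpoints` itself, keeping the original file name from the release URL, since timm looks the checkpoint up by name.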