How was weights/hardnet_petite_base.pth trained? #36

Open
blake-varden opened this issue Apr 16, 2020 · 4 comments

@blake-varden

Hi, I'm looking to convert this repo to TensorFlow. I believe most of the operations here are standard, so the conversion should be relatively straightforward.

I will, however, need to regenerate the pretrained weights for hardnet_petite_base. Could you share the details of how you trained it?

Is the structure just the decoder portion of the network, followed by an FC layer mapping to the number of classes, trained on ImageNet for 100 epochs?

@PingoLH
Owner

PingoLH commented Apr 21, 2020

Hi, yes, we trained it on ImageNet for the first 100 epochs out of a total of 150 epochs under a cosine learning rate schedule.
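
For anyone reproducing this in another framework, below is a minimal PyTorch sketch of that schedule; the optimizer settings and the `model` stand-in are assumptions for illustration, and only the 100-of-150-epoch cosine schedule comes from the reply above.

```python
import torch

model = torch.nn.Conv2d(3, 48, 3)  # stand-in for the petite base backbone, not the real network
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9, weight_decay=1e-4)

total_epochs = 150  # cosine learning rate schedule defined over 150 epochs
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=total_epochs)

for epoch in range(total_epochs):
    # ... one ImageNet training epoch would go here ...
    scheduler.step()
    if epoch + 1 == 100:
        # per the reply above, the released hardnet_petite_base.pth corresponds to epoch 100
        torch.save(model.state_dict(), "hardnet_petite_base.pth")
```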

@blake-varden
Author

Hi, thank you. And to confirm: the architecture of the petite base is the decoder + FC layer + softmax to classify the classes?

@PingoLH
Owner

PingoLH commented Apr 21, 2020

Yes, just adding an FC layer with a cross-entropy loss should be fine.
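
For reference when porting, here is a minimal PyTorch sketch of that classification setup; `HarDNetClassifier`, `feat_channels`, and the global-average-pooling choice are assumptions for illustration, while the backbone + FC layer + cross-entropy structure comes from the replies above.

```python
import torch.nn as nn

class HarDNetClassifier(nn.Module):
    """Hypothetical wrapper: backbone features -> global pool -> FC -> class logits."""
    def __init__(self, backbone: nn.Module, feat_channels: int, num_classes: int = 1000):
        super().__init__()
        self.backbone = backbone                          # petite base feature extractor
        self.pool = nn.AdaptiveAvgPool2d(1)               # global average pooling
        self.fc = nn.Linear(feat_channels, num_classes)   # FC layer mapping to class logits

    def forward(self, x):
        x = self.backbone(x)
        x = self.pool(x).flatten(1)
        return self.fc(x)  # raw logits; softmax is applied inside the loss

# CrossEntropyLoss applies log-softmax internally, so the model outputs raw logits
criterion = nn.CrossEntropyLoss()
```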

@DonghweeYoon

How many hours did you spend training the model on ImageNet?
