Graph definition forces input to have batchsize number of images #6
Fixed this by replacing `self.batchsize` in the model definition with `tf.shape(self.images)[0]` and passing an extra argument to `conv_transpose()`.
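A minimal sketch of that fix, assuming TensorFlow 2.x in eager mode. The `conv_transpose()` signature, the `upsample()` wrapper, and the concrete shapes are illustrative assumptions, not the repo's exact code; the key idea is that the output shape is built from the dynamic `tf.shape(...)[0]` instead of a fixed batch-size constant:

```python
import tensorflow as tf  # TF 2.x, eager mode assumed

def conv_transpose(x, output_shape, kernel=5, stride=2):
    # Sketch of a conv_transpose() helper taking the extra output_shape
    # argument mentioned in the fix; names and signature are assumptions.
    in_ch = x.shape[-1]
    out_ch = int(output_shape[-1])  # eager-mode read of the channel count
    w = tf.Variable(tf.random.truncated_normal(
        [kernel, kernel, out_ch, in_ch], stddev=0.02))
    return tf.nn.conv2d_transpose(
        x, w, output_shape=output_shape,
        strides=[1, stride, stride, 1], padding="SAME")

def upsample(images):
    # Build output_shape from the dynamic batch size instead of a fixed
    # self.batchsize, so the same graph accepts any number of images.
    batch = tf.shape(images)[0]
    return conv_transpose(images, tf.stack([batch, 16, 16, 8]))

full = upsample(tf.random.normal([64, 8, 8, 16]))   # training-sized batch
single = upsample(tf.random.normal([1, 8, 8, 16]))  # one test image also works
```

With a hard-coded batch size in `output_shape`, the second call would fail; with the dynamic shape, both produce `(batch, 16, 16, 8)` outputs.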
Suppose you want to restore the trained/saved model and run it on new inputs. The way the model is written makes it inflexible with respect to the number of inputs: you must always pass in exactly as many examples as the batch size the model was trained with. You can't, for example, run the model on a single test image, which would be useful.