
Keras SQRT implementation #1

Open
gowthamkpr opened this issue Sep 10, 2019 · 0 comments

Hi all,

I am getting warnings from Theano's NanGuard complaining about np.inf being a very large number ;)

It seems the source of this warning is the way backend.sqrt is implemented in Keras across the different backends.
This is the TensorFlow implementation (comments stripped):

keras/keras/backend/tensorflow_backend.py

Line 1568 in a0e90bd

def sqrt(x):
    zero = _to_tensor(0., x.dtype.base_dtype)
    inf = _to_tensor(np.inf, x.dtype.base_dtype)
    x = tf.clip_by_value(x, zero, inf)
    return tf.sqrt(x)
For the sqrt case, wouldn't it be simpler to use tf.maximum(x, zero) instead of tf.clip_by_value? clip_by_value evaluates both bounds in TensorFlow, producing a more complex graph (clip emits two ops where one would do):
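A minimal sketch of the point being made, using NumPy as a stand-in for the TensorFlow ops (np.clip mirrors tf.clip_by_value, np.maximum mirrors tf.maximum): when the upper bound is infinity, the clip is equivalent to a single lower-bound operation.

```python
import numpy as np

x = np.array([-2.0, -0.5, 0.0, 3.0, 9.0])

# Current Keras behaviour: clip to [0, inf) before taking the square root.
clipped = np.clip(x, 0.0, np.inf)

# Suggested alternative: only a lower bound is needed, since sqrt has
# no meaningful upper limit -- one op instead of two.
lower_bounded = np.maximum(x, 0.0)

# Element-wise, the two are identical; the difference is graph complexity.
assert np.array_equal(clipped, lower_bounded)
print(np.sqrt(lower_bounded))  # [0. 0. 0. 1.73205081 3.]
```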

https://github.com/tensorflow/tensorflow/blob/a6d8ffae097d0132989ae4688d224121ec6d8f35/tensorflow/python/ops/clip_ops.py#L39

This also applies to Theano, and probably to some activation functions such as relu.

I don't really understand the rationale behind this protection (using clip to force the argument to be non-negative). As a developer, if I design a component or an optimizer that misbehaves at this point (e.g. taking the square root of a negative value), I would prefer an explicit error message from the lower-level layer (TensorFlow, Theano). The CNTK backend does not add this protection, and that seems fine.
Thanks in advance for your feedback.
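To illustrate the "fail loudly" preference above, here is a small sketch (using NumPy rather than a specific backend, as an assumption): without the clip, an invalid argument surfaces immediately as a RuntimeWarning and a NaN, instead of being silently mapped to 0.

```python
import warnings
import numpy as np

x = np.array([-1.0, 4.0])

# Unprotected sqrt: the bad input is not hidden -- NumPy emits a
# RuntimeWarning ("invalid value encountered") and returns NaN.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    y = np.sqrt(x)

print(y)  # [nan  2.]
assert np.isnan(y[0]) and y[1] == 2.0
assert any(issubclass(w.category, RuntimeWarning) for w in caught)
```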
