This is my first post on GitHub; I hope I have provided enough information.
I was trying to replicate and adapt the code for computing Hessian-vector products for my CNN model, as done in "[1901.10159] An Investigation into Neural Net Optimization via Hessian Eigenvalue Density".
Then I noticed that it throws an error: `LookupError: gradient registry has no entry for: ResizeBilinearGrad`. After some debugging and searching, I found that the problem is the Keras `UpSampling2D` layer in my model. The first-derivative calculations work perfectly, but when I try to compute the second gradients (the derivative of the first gradients), this error is raised.
I found that the gradient function for the `ResizeBilinearGrad` op is not implemented in TF, and that this is what causes the error. Custom gradients can be registered using `@tf.RegisterGradient("ResizeBilinearGrad")`, but I don't know exactly what the function should return. The required gradient function has the form given below:
```python
def _ResizeBilinearGradGrad(op, grad):
    # op   - the 'ResizeBilinearGrad' op
    # grad - gradient w.r.t. the output of the ResizeBilinearGrad op
    # should return:
    #   A - gradient w.r.t. the first input to the op
    #   B - gradient w.r.t. the input image
    ...
```
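For reference, here is a sketch of what such a registration might look like. It is based on the observation that `ResizeBilinearGrad` is linear in its first input and uses the image input only for its shape, so the second-order gradient is just a bilinear resize of the incoming gradient, and `None` for the image. The attribute forwarding (`align_corners`) is an assumption on my part, and newer TF versions may already ship this gradient, hence the try/except:

```python
import tensorflow as tf

try:
    @tf.RegisterGradient("ResizeBilinearGrad")
    def _ResizeBilinearGradGrad(op, grad):
        """Second-order gradient sketch for ResizeBilinearGrad.

        ResizeBilinearGrad(grads, original_image) is linear in `grads` and
        does not depend on the *values* of `original_image` (only its shape),
        so the gradient w.r.t. `grads` is a bilinear resize of `grad` back to
        the shape of op.inputs[0], and the image input receives no gradient.
        """
        # Assumption: forwarding only `align_corners`; recent TF versions may
        # also need `half_pixel_centers` forwarded the same way.
        align_corners = op.get_attr("align_corners")
        target_size = tf.shape(op.inputs[0])[1:3]
        grad0 = tf.compat.v1.image.resize_bilinear(
            grad, target_size, align_corners=align_corners)
        return grad0, None  # (A, B): B is None for the image input
except KeyError:
    # A KeyError here means this TF build already registers the gradient;
    # in that case the builtin one should be kept.
    pass
```

With this registered (or on a TF version that already provides it), a double `tf.GradientTape` through an `UpSampling2D(interpolation="bilinear")` layer should produce a Hessian-vector product without the `LookupError`.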
The inputs to the `ResizeBilinearGrad` op are:
- the gradients propagated up to that point
- the original input image
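To illustrate why I believe the answer reduces to "resize the incoming gradient, return nothing for the image": given the image shape, bilinear resizing is a fixed linear map `R`, so `ResizeBilinearGrad` computes `R^T g`; differentiating `R^T g` again w.r.t. `g` just applies `R`, while the derivative w.r.t. the image *values* is zero. A tiny 1-D NumPy illustration (the matrix `R` here is a hand-built 2-to-4 toy upsample of my own, not TF's exact interpolation kernel):

```python
import numpy as np

# Toy 1-D "bilinear resize" from 2 samples to 4: a fixed linear map R.
# (Illustrative weights only; TF's exact weights depend on align_corners
# and half_pixel_centers.)
R = np.array([
    [1.00, 0.00],
    [0.67, 0.33],
    [0.33, 0.67],
    [0.00, 1.00],
])

def resize(x):
    # Forward op (ResizeBilinear): apply the linear map R.
    return R @ x

def resize_grad(g, image):
    # Backward op (ResizeBilinearGrad): apply R^T to the incoming gradient.
    # The `image` values are never used, only its shape determines R.
    return R.T @ g

image = np.array([1.0, 2.0])
g = np.array([0.1, 0.2, 0.3, 0.4])
dg = np.array([1.0, -1.0, 0.5, 0.0])

# Linearity: perturbing g by dg changes the output by exactly R^T dg,
# so the second-order gradient w.r.t. g is again a bilinear resize.
response = resize_grad(g + dg, image) - resize_grad(g, image)

# The image input contributes nothing to the output values.
same = resize_grad(g, image + 123.0)
```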
Can someone provide some direction on this problem?