Comments (6)
By the way, both x and y are vectors with float values.
from autograd.
Note that your function func(x, y) returns a vector instead of a scalar. Therefore, you need to use the jacobian function in autograd, not grad. Example code:
```python
import autograd.numpy as np
from autograd import jacobian

def f(x, y):
    return np.abs(x - y)

jacobian_f_wrt_x = jacobian(f, 0)  # 0 indicates the first argument of f(x, y)
jacobian_f_wrt_y = jacobian(f, 1)  # 1 indicates the second argument of f(x, y)

x = np.arange(-4, 6, 2, dtype=float)
y = np.zeros(5) + 1.5

print(jacobian_f_wrt_x(x, y))
print(jacobian_f_wrt_y(x, y))
```
Here jacobian_f_wrt_x(x, y) returns the partial derivative of each element in the output of f(x, y) with respect to each element in x. Because x contains 5 items and f(x, y) returns 5 items, this will be a 5x5 matrix. The (i, j)-th element in this matrix corresponds to the partial derivative of output[i] with respect to x[j]. Likewise, jacobian_f_wrt_y(x, y) returns the partial derivative of each element in the output of f(x, y) with respect to each element in y.
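For this particular function the Jacobian is diagonal, and the diagonal entries can be worked out by hand: d|x - y|/dx = sign(x - y) elementwise, and d|x - y|/dy is its negation. A minimal NumPy-only sketch of what the two matrices above look like (no autograd needed):

```python
import numpy as np

x = np.arange(-4, 6, 2, dtype=float)  # [-4. -2.  0.  2.  4.]
y = np.zeros(5) + 1.5

# output[i] depends only on x[i] and y[i], so both Jacobians are diagonal:
# d|x - y|/dx = sign(x - y), d|x - y|/dy = -sign(x - y)
J_x = np.diag(np.sign(x - y))
J_y = np.diag(-np.sign(x - y))

print(J_x.diagonal())  # [-1. -1. -1.  1.  1.]
print(J_y.diagonal())  # [ 1.  1.  1. -1. -1.]
```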
I hope this helps.
from autograd.
Thank you very much!
Currently I use elementwise_grad instead of jacobian. I found that elementwise_grad returns the vector that I want, whereas jacobian returns a (d x d) matrix. I checked the values in the jacobian; they are the same as those from elementwise_grad. So I think elementwise_grad satisfies my need. Am I correct?
Yes and no. The jacobian function gives you the complete Jacobian of your vector function f with respect to the argument x or y (see: https://en.wikipedia.org/wiki/Jacobian_matrix_and_determinant). What the elementwise_grad function does is take the partial derivative of output[i] with respect to input[i]. These partial derivatives are equivalent to the diagonal elements of the Jacobian matrix. In your function abs(x - y), all the off-diagonal elements are zero. However, this does not mean that jacobian and elementwise_grad give the same output in general, as the latter only returns the diagonal of the Jacobian matrix.
Again, I can probably best illustrate this with an example, by adding an interaction between x and y to the function:
```python
import autograd.numpy as np
from autograd import jacobian

def f(x, y):
    return np.abs(x - y + np.dot(x, y))

jacobian_f_wrt_x = jacobian(f, 0)  # 0 indicates the first argument of f(x, y)
jacobian_f_wrt_y = jacobian(f, 1)  # 1 indicates the second argument of f(x, y)

x = np.arange(-4, 6, 2, dtype=float)
y = np.zeros(5) + 1.5

print(jacobian_f_wrt_x(x, y))
print(jacobian_f_wrt_y(x, y))
```
Now the output has changed to:

```
[[-2.5 -1.5 -1.5 -1.5 -1.5]
 [-1.5 -2.5 -1.5 -1.5 -1.5]
 [-1.5 -1.5 -2.5 -1.5 -1.5]
 [ 1.5  1.5  1.5  2.5  1.5]
 [ 1.5  1.5  1.5  1.5  2.5]]
[[ 5.  2.  0. -2. -4.]
 [ 4.  3.  0. -2. -4.]
 [ 4.  2.  1. -2. -4.]
 [-4. -2.  0.  1.  4.]
 [-4. -2.  0.  2.  3.]]
```
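These dense matrices can also be checked without autograd. Below is a sketch using a hypothetical central-finite-difference helper, num_jacobian (not part of autograd), which approximates J[i, j] = d output[i] / d args[argnum][j] for the interaction function above:

```python
import numpy as np

def f(x, y):
    return np.abs(x - y + np.dot(x, y))

def num_jacobian(fun, args, argnum, h=1e-6):
    # Hypothetical finite-difference check (not an autograd API):
    # J[i, j] ~= (f(... arg[j]+h ...)[i] - f(... arg[j]-h ...)[i]) / (2h)
    a = [np.array(v, dtype=float) for v in args]
    m = fun(*a).size
    n = a[argnum].size
    J = np.empty((m, n))
    for j in range(n):
        a[argnum][j] += h
        up = fun(*a)
        a[argnum][j] -= 2 * h
        down = fun(*a)
        a[argnum][j] += h  # restore
        J[:, j] = (up - down) / (2 * h)
    return J

x = np.arange(-4, 6, 2, dtype=float)
y = np.zeros(5) + 1.5

Jx = num_jacobian(f, (x, y), 0)
Jy = num_jacobian(f, (x, y), 1)
# Jx[0] is approximately [-2.5 -1.5 -1.5 -1.5 -1.5], matching the output above.
```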
Hope this helps.
Thanks for your great explanations. Now I know what I should do in my project.
Thanks for the explanations, @brunojacobs! Indeed, as pointed out, elementwise_grad is a special case of jacobian.

I thought I'd chime in with one more idea: you might try gluing x and y together in a tuple so you can get both of their gradients at once:
```python
import autograd.numpy as np
from autograd import grad, elementwise_grad

def f((x, y)):  # Python 2 tuple-parameter unpacking
    return np.abs(x - y)

x = np.arange(-4, 6, 2, dtype=float)
y = np.zeros(5) + 1.5
print elementwise_grad(f)((x, y))
```

```
In [1]: run issue66
(array([-1., -1., -1.,  1.,  1.]), array([ 1.,  1.,  1., -1., -1.]))
```
In Python 3 (which removed tuple-parameter unpacking) you might have to do something like:

```python
def f(xy):
    x, y = xy
    return np.abs(x - y)
```
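The same tuple trick can be exercised without autograd installed. The sketch below uses a hypothetical helper, elementwise_grad_fd, that approximates d output[i] / d input[i] for each array in the tuple via central finite differences, reproducing the gradient pair shown above:

```python
import numpy as np

def f(xy):
    x, y = xy
    return np.abs(x - y)

def elementwise_grad_fd(fun, xy, h=1e-6):
    # Hypothetical helper (not part of autograd): for each array in the
    # tuple, estimate the derivative of output[i] w.r.t. that array's i-th
    # element with a central difference.
    grads = []
    for k in range(len(xy)):
        g = np.empty_like(xy[k])
        for i in range(g.size):
            bumped = [a.copy() for a in xy]
            bumped[k][i] += h
            up = fun(tuple(bumped))[i]
            bumped[k][i] -= 2 * h
            down = fun(tuple(bumped))[i]
            g[i] = (up - down) / (2 * h)
        grads.append(g)
    return tuple(grads)

x = np.arange(-4, 6, 2, dtype=float)
y = np.zeros(5) + 1.5
gx, gy = elementwise_grad_fd(f, (x, y))
# gx is approximately [-1. -1. -1.  1.  1.] and gy its negation,
# matching the autograd result in the comment above.
```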