Comments (5)
If there's a workaround in the interim I'd be very happy to hear it. I tried manually constructing the diff vector, but then you end up with an array/list of FloatNodes and you can't seem to do much with them - e.g. taking a sqrt.
In [16]: def diff(x):
    ...:     dx = [x[0]]
    ...:     for idx in range(1, x.size):
    ...:         dx.append(x[idx] - x[idx-1])
    ...:     return np.asarray(dx)
In [17]: # gradient of dummy function which uses `diff`
...: g = grad(lambda x: np.sqrt(diff(x)).sum())
In [18]: g(randn(10))
Traceback (most recent call last):
  File "<ipython-input-18-acc7bd9d5211>", line 1, in <module>
    g(randn(10))
  File "C:\Anaconda3\lib\site-packages\autograd\core.py", line 21, in gradfun
    return backward_pass(*forward_pass(fun,args,kwargs,argnum))
  File "C:\Anaconda3\lib\site-packages\autograd\core.py", line 63, in forward_pass
    except Exception as e: add_extra_error_message(e)
  File "C:\Anaconda3\lib\site-packages\autograd\core.py", line 392, in add_extra_error_message
    raise_(etype, value, traceback)
  File "C:\Anaconda3\lib\site-packages\future\utils\__init__.py", line 414, in raise_
    raise exc.with_traceback(tb)
  File "C:\Anaconda3\lib\site-packages\autograd\core.py", line 62, in forward_pass
    try: end_node = fun(*args, **kwargs)
  File "<ipython-input-17-a063cc27b22f>", line 2, in <lambda>
    g = grad(lambda x: np.sqrt(diff(x)).sum())
  File "C:\Anaconda3\lib\site-packages\autograd\core.py", line 163, in __call__
    result = self.fun(*argvals, **kwargs)
AttributeError: 'FloatNode' object has no attribute 'sqrt'
from autograd.
For reference, here's a workaround that works but can't be very efficient:

def diff(x):
    D = np.diag(np.ones(x.size)) - np.diag(np.ones(x.size-1), k=-1)
    return np.dot(D, x)
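As a sanity check (a plain-NumPy sketch, no autograd involved), the dense matrix above reproduces the loop-based diff from the first comment: the first entry passes through unchanged and the rest are consecutive differences.

```python
import numpy as np

def diff(x):
    # Dense finite-difference matrix: identity minus the sub-diagonal,
    # so (D @ x)[0] == x[0] and (D @ x)[i] == x[i] - x[i-1] for i >= 1.
    D = np.diag(np.ones(x.size)) - np.diag(np.ones(x.size - 1), k=-1)
    return np.dot(D, x)

x = np.random.randn(10)
out = diff(x)
assert np.isclose(out[0], x[0])           # first entry passes through
assert np.allclose(out[1:], np.diff(x))   # remaining entries are the differences
```

Because the whole thing is expressed as a single matrix multiply, autograd only needs the (already supported) gradient of np.dot, which is why this version works.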
I think the new commit b5e5cde should do it. Reopen the issue if there's something amiss.
I don't think we have a mailing list or gitter chat room. I've never used a gitter chat room. Think that would be a good idea?
Take a look at the commit to see how I implemented the gradient of diff, though it's not particularly slick or anything: diff is a linear operation, namely multiplication on the left by a matrix like this one:

[-1,  1,  0,  0]
[ 0, -1,  1,  0]
[ 0,  0, -1,  1]

so to compute the reverse-mode gradient we need to multiply the incoming gradient by that matrix on the right. Multiplying by that matrix on the right looks like computing something like np.concatenate((-g[:1], -np.diff(g), g[-1:])).
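A quick way to convince yourself of that closed form (a plain-NumPy sketch, not autograd's actual source): build the forward-difference matrix D explicitly and compare the vector-Jacobian product D.T @ g against the concatenate expression.

```python
import numpy as np

n = 6
x = np.random.randn(n)

# Forward-difference matrix D with shape (n-1, n): row i has -1 at
# column i and +1 at column i+1, so that D @ x == np.diff(x).
D = np.zeros((n - 1, n))
for i in range(n - 1):
    D[i, i] = -1.0
    D[i, i + 1] = 1.0

assert np.allclose(D @ x, np.diff(x))

# Reverse mode: the incoming gradient g has length n-1; multiplying it
# by D "on the right" (g @ D == D.T @ g) gives the outgoing gradient,
# which the closed form reproduces without materializing D.
g = np.random.randn(n - 1)
vjp = np.concatenate((-g[:1], -np.diff(g), g[-1:]))
assert np.allclose(vjp, D.T @ g)
```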
Thanks, @mattjj - that was fast! I can confirm that it works for me.
Yeah, my ugly workaround was to construct the finite-difference matrix and then let autograd do its magic!
Gitter seems to be the open-source tool of choice for chat rooms linked up with GitHub. IMHO it can be handy to have a higher-bandwidth forum than opening GitHub issues, and it's better than personal email because the content is archived and searchable, so others can learn from the conversations and you hopefully don't have to repeat yourself. A Google Groups forum is another option, and arguably better, as the conversations are threaded and therefore easier to follow.
I think all three forums have their place, but having either Gitter or Google Groups in addition to GitHub issues would certainly be useful in fostering discussion with an outside community.
Obviously, as the authors, you should feel free to use (or not) whatever tools make your lives/work easier.