
Einconv: Convolutions Through the Lens of Tensor Networks

This package offers einsum-based implementations of convolutions and related operations in PyTorch.

Its name is inspired by this GitHub repository, which was the starting point for our work.

Installation

Install from PyPI via pip:

pip install einconv

Features & Usage

In general, einconv's goals are:

  • Full hyper-parameter support (stride, padding, dilation, groups, etc.)
  • Support for any dimension (e.g. 5d convolution)
  • Optimizations via symbolic simplification

Modules

einconv provides einsum-based implementations of the following PyTorch modules:

| torch module      | einconv module     |
|-------------------|--------------------|
| `nn.Conv{1,2,3}d` | `modules.ConvNd`   |
| `nn.Unfold`       | `modules.UnfoldNd` |

They work in exactly the same way as their PyTorch equivalents.

Functionals

einconv provides einsum-based implementations of the following PyTorch functionals:

| torch functional               | einconv functional     |
|--------------------------------|------------------------|
| `nn.functional.conv{1,2,3}d`   | `functionals.convNd`   |
| `nn.functional.unfold`         | `functionals.unfoldNd` |

They work in exactly the same way as their PyTorch equivalents.
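To make "einsum-based" concrete for unfolding, here is a self-contained pure-PyTorch sketch (independent of einconv; stride 1, no padding, no dilation) that writes im2col as a single contraction of the input with a binary index-pattern tensor:

```python
import torch
from torch import einsum

# A 1d index-pattern tensor: P[k, i, o] = 1 iff input position i lands
# at kernel position k of output patch o (stride 1, no padding).
N, C, I, K = 2, 3, 8, 3
O = I - K + 1
x = torch.randn(N, C, I)

P = torch.zeros(K, I, O)
for k in range(K):
    for o in range(O):
        P[k, o + k, o] = 1.0

# im2col/unfold as a single tensor contraction
patches = einsum("nci,kio->ncko", x, P)  # shape (N, C, K, O)

# Reference: PyTorch's built-in sliding-window view
reference = x.unfold(2, K, 1).permute(0, 1, 3, 2)  # also (N, C, K, O)
assert torch.allclose(patches, reference)
```

einconv generalizes this construction to arbitrary hyper-parameters and dimensions; the pattern tensor above is only an illustration of the idea.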

Einsum Expressions

einconv can generate einsum expressions (equation, operands, and output shape) for the following operations:

  • Forward pass of N-dimensional convolution
  • Backward pass (input and weight VJPs) of N-dimensional convolution
  • Input unfolding (im2col/unfold) for inputs of N-dimensional convolution
  • Input-based Kronecker factors of Fisher approximations for convolutions (KFC and KFAC-reduce)
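As a pure-PyTorch illustration of the last point (scaling conventions for these factors differ across references, so this sketch is not necessarily einconv's exact definition): given an unfolded input, the KFC-style factor contracts all patch columns, while the KFAC-reduce-style factor averages the columns before taking outer products.

```python
import torch

# Unfolded input U has shape (N, C_in * K, O): one column per patch.
N, CK, O = 4, 6, 5
U = torch.randn(N, CK, O)

# KFC-style factor: outer products summed over all patch columns,
# averaged over the batch (one possible scaling convention).
kfc = torch.einsum("nio,njo->ij", U, U) / N

# KFAC-reduce-style factor: average the patch columns first,
# then take outer products and average over the batch.
u = U.mean(dim=2)                        # (N, C_in * K)
kfac_reduce = torch.einsum("ni,nj->ij", u, u) / N
```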

These can then be evaluated with einsum. For instance, the einsum expression for the forward pass of an N-dimensional convolution is

```python
from torch import einsum
from einconv.expressions import convNd_forward

equation, operands, shape = convNd_forward.einsum_expression(...)
result = einsum(equation, *operands).reshape(shape)
```

All expressions follow this pattern.
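For intuition about what such an expression computes, here is a self-contained pure-PyTorch sketch of a 1d convolution written as one einsum with a binary index-pattern tensor (stride 1, no padding, no groups; einconv's generated equation and operand layout may differ):

```python
import torch
from torch import einsum
import torch.nn.functional as F

N, C_in, C_out, I, K = 2, 3, 4, 8, 3
O = I - K + 1  # output size for stride 1, no padding, no dilation
x = torch.randn(N, C_in, I)
weight = torch.randn(C_out, C_in, K)

# Binary index-pattern tensor: P[k, i, o] = 1 iff input index i
# contributes to output index o through kernel index k.
P = torch.zeros(K, I, O)
for k in range(K):
    for o in range(O):
        P[k, o + k, o] = 1.0

# Forward pass of the convolution as a single einsum
y = einsum("nci,dck,kio->ndo", x, weight, P)

assert torch.allclose(y, F.conv1d(x, weight), atol=1e-5)
```

Conceptually, the same three-operand structure carries over to higher dimensions, with one pattern tensor per spatial axis.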

Symbolic Simplification

Some operations (e.g. dense convolutions) can be optimized via symbolic simplifications. This is turned on by default as it generally improves performance. You can also generate a non-optimized expression and simplify it:

```python
from torch import einsum
from einconv import simplify
from einconv.expressions import convNd_forward

equation, operands, shape = convNd_forward.einsum_expression(..., simplify=False)
equation, operands = simplify(equation, operands)
result = einsum(equation, *operands).reshape(shape)
```

Sometimes it might be better to inspect the non-simplified expression to see how indices relate to operands.

Citation

If you find the einconv package useful for your research, please consider citing the accompanying article:

```bib
@article{dangel2023convolutions,
  title =        {Convolutions Through the Lens of Tensor Networks},
  author =       {Dangel, Felix},
  year =         2023,
}
```

Limitations

  • Currently, none of the underlying operations (computation of index pattern tensors, generation of einsum equations and shapes, simplification) is cached. This adds overhead, although it is usually small compared to the cost of evaluating the expression via einsum.

  • At the moment, the code that performs expression simplifications is coupled to PyTorch. I plan to address this by switching to a symbolic implementation, which will also allow efficient caching.

Contributors

  • f-dangel

einconv's Issues

[REQ] Exclude `test` directory from package

This used to work by adding the following two lines to setup.cfg:

```ini
[options.packages.find]
exclude = test*
```

However, this stopped working, and I haven't yet identified the reason. Removing these two lines fixes the installation, so until there is a proper solution I will comment them out. This should be fixed before the next release.

Specifically, installing the package raises the following error:

```
configparser.DuplicateSectionError: While reading from 'setup.cfg' [line 44]: section 'options.packages.find' already exists
```
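The traceback hints at the cause: setup.cfg apparently declares [options.packages.find] twice. A plausible fix (an assumption, not verified against this repository's setup.cfg) is to keep a single section and merge the exclude into it:

```ini
# Hypothetical merged section: keep only ONE [options.packages.find]
# in setup.cfg and move any options from the duplicate section into it.
[options.packages.find]
exclude = test*
```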

Support for ConvTransposeNd

Hello, thanks for this super cool project!

I'd really like to try this out, but my network architecture requires a transposed convolution.

Is it possible to support this as well?
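For readers wondering how this could fit in: conceptually, a transposed convolution uses the same index-pattern tensor as the forward convolution, just contracted over its output index instead of its input index. A pure-PyTorch sketch (not einconv's implementation; stride 1, no padding):

```python
import torch
from torch import einsum
import torch.nn.functional as F

N, C_in, C_out, O, K = 2, 4, 3, 6, 3
I = O + K - 1  # output length of the transposed conv (stride 1, no padding)
x = torch.randn(N, C_in, O)
weight = torch.randn(C_in, C_out, K)  # conv_transpose1d weight layout

# Same index-pattern tensor as for the forward convolution:
# P[k, i, o] = 1 iff i = o + k.
P = torch.zeros(K, I, O)
for k in range(K):
    for o in range(O):
        P[k, o + k, o] = 1.0

# Transposed convolution: contract P over its output index o instead of i
y = einsum("nco,cdk,kio->ndi", x, weight, P)

assert torch.allclose(y, F.conv_transpose1d(x, weight), atol=1e-5)
```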
