
Comments (2)

nikitakit commented on June 13, 2024

The project was built in PyTorch from the very beginning, but when it came time to do a parser release I wasn't happy with the tools PyTorch offered. In PyTorch (as of version 0.4) you can't distribute models independently of the Python code that was used to create them, which causes several issues: (a) the training code requires Python 3.6+ and doesn't run on Windows; (b) upgrading PyTorch (or other library) versions while I'm actively working on a research project is very disruptive, but a release codebase needs to support new framework versions whenever they come out; and (c) TensorFlow has much better tools for model quantization/compression, which I use to keep model download sizes small.
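The coupling described above is essentially Python serialization semantics: a PyTorch checkpoint stores the tensors, but reconstructing the model needs the original `nn.Module` class to be importable. The same constraint can be sketched with the standard library's `pickle` alone (no PyTorch required):

```python
import pickle


class Model:
    """Stand-in for an nn.Module: parameters live on an instance of a class."""

    def __init__(self, weight):
        self.weight = weight


model = Model(weight=[0.1, 0.2, 0.3])

# Pickling stores a *reference* to the class, not the class's code:
blob = pickle.dumps(model)

# Unpickling succeeds here only because class Model is importable in this
# process. Ship `blob` to a machine without the class definition and loading
# fails -- the same reason a PyTorch-0.4-era checkpoint cannot be used
# without the Python code that defined the model.
restored = pickle.loads(blob)
print(restored.weight)  # [0.1, 0.2, 0.3]
```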

The nice thing about TensorFlow is that you can compile your model into a computation graph and save that graph to disk, after which it can be loaded and used independently of the Python code that created it. I figured that forward compatibility would be better for TensorFlow's on-disk format than for any Python code I write. I also tried my best to make sure the release code runs across a wide range of environments, including Python 2.
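The distinction can be illustrated without TensorFlow itself: a computation graph is just data describing operations, so it can be serialized and later executed by a generic interpreter that knows nothing about the code that built it. A toy sketch of that principle (not TensorFlow's actual format, just the idea behind it):

```python
import json

# Build a tiny "graph": nodes are pure data, not Python functions.
graph = {
    "inputs": ["x"],
    "nodes": [
        {"name": "h", "op": "mul", "args": ["x", 2.0]},
        {"name": "y", "op": "add", "args": ["h", 1.0]},
    ],
    "output": "y",
}

# The graph round-trips through disk (or the network) as plain JSON.
serialized = json.dumps(graph)

OPS = {"add": lambda a, b: a + b, "mul": lambda a, b: a * b}


def run(graph_json, **feeds):
    """Generic interpreter: needs no knowledge of the code that built the graph."""
    g = json.loads(graph_json)
    env = dict(feeds)
    for node in g["nodes"]:
        args = [env[a] if isinstance(a, str) else a for a in node["args"]]
        env[node["name"]] = OPS[node["op"]](*args)
    return env[g["output"]]


print(run(serialized, x=3.0))  # 7.0
```

Because the saved artifact is data rather than code, the loader's compatibility story is decoupled from the author's Python environment, which is the forward-compatibility argument made above.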

At one point I also wanted to write release bindings for languages other than Python, but I never got around to doing that.

from self-attentive-parser.

BramVanroy commented on June 13, 2024

Thank you for the elaborate reply!

Many projects are switching to supporting only Python >=3.6, and Python 2.x support is being dropped everywhere; official support for Python 2 ends on January 1st anyway. This is a good website to get an idea of who's dropping support (spoiler: almost all major packages). I don't think that making your project require 3.6 or later is a bad idea; perhaps work towards it for a new release at the end of this year? It will save you a lot of compatibility headaches, I'm sure!
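For what it's worth, the usual way to declare such a version floor in packaging metadata is setuptools' `python_requires`, which makes pip refuse to install the package on older interpreters. A minimal sketch (the package name and version are placeholders):

```python
# setup.py -- minimal sketch; "example-parser" is a placeholder name
from setuptools import setup

setup(
    name="example-parser",
    version="1.0.0",
    python_requires=">=3.6",  # pip will refuse to install on Python < 3.6
)
```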

I'm not sure why PyTorch models won't run on Windows. I've been using pretrained models since 0.4.1, and they work fine. It's true that the models might not be as small as TF computation graphs, but one could argue that shouldn't be too big an issue: the model is only downloaded once. Computation graphs vs. model states is a debate with plenty of pros and cons on both sides. I do see that in the field of NLP, many implementations have moved to PyTorch >= 1.0.

All this to say that I understand and of course respect your decision. Thank you again for this package!

