
Quantum Transformers

This project explores how the Transformer architecture can be executed on quantum computers. In particular, the focus is on the adaptation of the Vision Transformer for the analysis of high-energy physics data.
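The key step a Vision Transformer adds on top of the standard Transformer is turning an image into a sequence of tokens by cutting it into fixed-size patches. As a conceptual sketch (plain NumPy, not code from this library), the patchification looks like:

```python
import numpy as np

def patchify(image, patch_size):
    """Split an image of shape (H, W, C) into flattened, non-overlapping
    patches: the token sequence a Vision Transformer consumes."""
    h, w, c = image.shape
    p = patch_size
    assert h % p == 0 and w % p == 0, "image dims must be divisible by patch size"
    # (H, W, C) -> (H/p, p, W/p, p, C) -> (H/p, W/p, p, p, C) -> (num_patches, p*p*C)
    patches = image.reshape(h // p, p, w // p, p, c)
    patches = patches.transpose(0, 2, 1, 3, 4).reshape(-1, p * p * c)
    return patches

# e.g. a 28x28 single-channel image with 7x7 patches -> 16 tokens of dimension 49
tokens = patchify(np.zeros((28, 28, 1)), 7)
```

Each token is then linearly projected into the model dimension and fed through attention layers; in the quantum variants, parts of those layers are replaced by parameterized quantum circuits.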

The relevance of this work is underscored by the start of operation of the High Luminosity Large Hadron Collider (HL-LHC) at the end of this decade, which will produce enormous quantities of data and in turn require vast computing resources. A promising approach for dealing with this data deluge is quantum machine learning (QML), which could reduce the time complexity of some classical algorithms by running them on quantum computers, and potentially achieve better accuracy.

The work has been undertaken by Marçal Comajoan Cara as part of Google Summer of Code 2023 with the ML4SCI organization.

You can read more details about the project in this blog post, which includes a summary of the results and the potential work that could be done in the future building on it.

Structure

The folder structure of the project is as follows:

  • quantum_transformers/: the library code for the quantum transformers, as well as for loading the data (datasets.py) and training the models (training.py).
    • quantum_transformers/qmlperfcomp/: subproject to compare the performance of different quantum machine learning frameworks. In particular, I evaluated PennyLane and TensorCircuit (spoiler: TensorCircuit is much faster).
  • notebooks/: the notebooks used for evaluating the models and showing their usage and performance. Each notebook is named after the dataset it uses.
    • visualizations.ipynb: notebook visualizing the image datasets.
    • classical/: classical counterparts as baselines.
    • quantum/: the quantum transformers. Additionally, qvit_cerrat_et_al.ipynb attempts to reproduce the results of "Quantum Vision Transformers" by Cerrat et al., so far without success.
  • hpopt/: hyperparameter optimization scripts. The folder contains a README with instructions on how to run them.
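The frameworks compared in qmlperfcomp (PennyLane and TensorCircuit) perform the same underlying computation: simulating parameterized circuits as linear-algebra operations on a statevector. As a rough illustration of the core operation they accelerate (a minimal NumPy sketch, not code from either framework or this project), applying a single-qubit gate looks like:

```python
import numpy as np

def ry(theta):
    """Single-qubit Y-rotation gate as a 2x2 matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def apply_gate(state, gate, target, n_qubits):
    """Apply a 2x2 gate to one qubit of an n-qubit statevector."""
    psi = state.reshape([2] * n_qubits)
    # contract the gate's column index with the target qubit's axis
    psi = np.tensordot(gate, psi, axes=([1], [target]))
    psi = np.moveaxis(psi, 0, target)
    return psi.reshape(-1)

# Rotate qubit 0 of a 2-qubit |00> state by pi: the amplitude moves to |10>
state = np.zeros(4)
state[0] = 1.0
state = apply_gate(state, ry(np.pi), 0, 2)
```

The frameworks differ in how they batch, differentiate, and JIT-compile such operations, which is where the large speed gap observed in the comparison comes from.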

Datasets

The architectures have been evaluated on the following datasets:

  • MNIST Digits, as a toy dataset for rapid prototyping
  • Quark-Gluon, one of the main high-energy physics datasets used in the project, which contains images of quark and gluon jets recorded by the CMS detector.
  • Electron-Photon, the other main high-energy physics dataset used in the project, which contains images of electron and photon showers recorded by the CMS detector.
  • IMDb Reviews, as a toy dataset for evaluating the non-vision transformers for text.

The datasets are downloaded automatically when loading them for the first time. Note that they require a lot of disk space and can take a long time to preprocess.

Installation

First, install Python if you don't have it already. Then, to install the project together with the dependencies, run the following command in the root folder:

pip install -e .

Usage

After installation, you can run the notebooks in the notebooks folder. You can also import the library in your own code (import quantum_transformers).

Acknowledgements

I would like to thank the mentors and fellow contributors from ML4SCI, especially Sergei Gleyzer, for supervising the project and providing guidance and support. I would also like to thank the ML4SCI organization for giving me the opportunity to work on this project, and Google for sponsoring it and supporting the Google Summer of Code program. Likewise, I would also like to thank the United States National Energy Research Scientific Computing Center (NERSC) for providing me with the computing resources to run the experiments. Finally, I also want to thank all the developers of the open-source software that I have used for this project.

License

The project is licensed under the GNU General Public License v3.0.

Contact

If you have any questions, feel free to email me at [email protected].

