Comments (5)

ZhihaoDU commented on August 25, 2024

Yes, you are correct. In FunCodec and LauraTTS, we didn't include any optimizations for the attention operation, since the model is very small.

ZhihaoDU commented on August 25, 2024

Sorry, I don't understand your question. Please provide more details to clarify it, thanks.

MinJ-lucky commented on August 25, 2024

My bad, let me explain my question.
As far as I know, large GPT-like models usually use attention optimizations such as FlashAttention to reduce GPU memory cost and speed up the forward pass. However, I didn't find anything like that in this repo, only a plain `torch.matmul` here. I'd like to confirm whether I'm right, or whether the optimization lives somewhere else.
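For later readers, here is a minimal sketch (not FunCodec code) of the two options being discussed: a plain `torch.matmul` attention versus PyTorch 2.x's `F.scaled_dot_product_attention`, which can dispatch to a fused FlashAttention-style kernel on supported hardware.

```python
import torch
import torch.nn.functional as F

# Plain attention built from torch.matmul, similar in spirit to the
# implementation discussed above (illustrative only, not the FunCodec code).
def naive_attention(q, k, v):
    scores = torch.matmul(q, k.transpose(-2, -1)) / (q.size(-1) ** 0.5)
    return torch.matmul(scores.softmax(dim=-1), v)

# PyTorch >= 2.0 alternative: scaled_dot_product_attention can dispatch to a
# fused (FlashAttention-style) kernel on supported GPUs, saving memory and time.
def fused_attention(q, k, v):
    return F.scaled_dot_product_attention(q, k, v)

q = k = v = torch.randn(2, 8, 128, 64)  # (batch, heads, seq_len, head_dim)
print(torch.allclose(naive_attention(q, k, v), fused_attention(q, k, v), atol=1e-4))
```

For a small model, the plain version is usually fast enough, which matches the answer above.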

MinJ-lucky commented on August 25, 2024

Thanks for your reply!
After reading the code, I still have two little questions:

1. Is LauraTTS a totally independent model, trained from scratch without any pretrained parameters from LauraGPT?
2. Since the LauraGPT paper is also referenced in this repo, and since LauraGPT and LauraTTS have very similar names, what is the relationship between the two?

ZhihaoDU commented on August 25, 2024

> Thanks for your reply! After reading the code, I still have two little questions:
>
> 1. Is LauraTTS a totally independent model, trained from scratch without any pretrained parameters from LauraGPT?
> 2. Since the LauraGPT paper is also referenced in this repo, and since LauraGPT and LauraTTS have very similar names, what is the relationship between the two?
1. Yes, LauraTTS is a totally independent model; you can train it from scratch on the LibriTTS corpus with the released code.
2. The LauraGPT paper references this repo because LauraGPT uses the same codec model to tokenize speech signals. In fact, LauraTTS can be treated as an improved version of LauraGPT for the TTS task only. There are two main differences between LauraTTS and LauraGPT for TTS: 1) LauraTTS uses two codec groups in the LM rather than one; 2) in LauraTTS the LM and the NAR model are trained jointly, whereas in LauraGPT they are trained separately (see the sketch below).
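To make the second point concrete, below is a toy PyTorch sketch of the joint LM + NAR objective described above. All module and parameter names (`ToyJointTTS`, `codec_vocab`, `n_groups`, etc.) are made up for illustration and this is not the actual FunCodec/LauraTTS implementation; it only shows the idea of an autoregressive LM predicting two codec groups while a NAR decoder predicts the remaining groups, with both losses summed so the parts train together.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyJointTTS(nn.Module):
    """Illustrative only: an AR LM predicts the first two codec groups,
    a NAR decoder predicts the remaining groups, and the two losses are
    summed so both parts are optimized in a single training step."""

    def __init__(self, codec_vocab=1024, dim=256, n_groups=8):
        super().__init__()
        self.codec_vocab, self.n_groups = codec_vocab, n_groups
        self.lm = nn.GRU(dim, dim, batch_first=True)           # stand-in for the AR LM
        self.lm_head = nn.Linear(dim, 2 * codec_vocab)         # two codec groups per frame
        enc_layer = nn.TransformerEncoderLayer(dim, nhead=4, batch_first=True)
        self.nar = nn.TransformerEncoder(enc_layer, num_layers=2)  # stand-in for the NAR model
        self.nar_head = nn.Linear(dim, (n_groups - 2) * codec_vocab)

    def forward(self, text_emb, codec_targets):
        # text_emb: (batch, seq, dim); codec_targets: (batch, seq, n_groups) token ids
        h, _ = self.lm(text_emb)
        b, t, _ = h.shape
        lm_logits = self.lm_head(h).view(b, t, 2, self.codec_vocab)
        lm_loss = F.cross_entropy(lm_logits.flatten(0, 2), codec_targets[..., :2].flatten())
        nar_logits = self.nar_head(self.nar(h)).view(b, t, self.n_groups - 2, self.codec_vocab)
        nar_loss = F.cross_entropy(nar_logits.flatten(0, 2), codec_targets[..., 2:].flatten())
        return lm_loss + nar_loss  # joint objective: LM and NAR trained together

model = ToyJointTTS()
loss = model(torch.randn(2, 50, 256), torch.randint(0, 1024, (2, 50, 8)))
loss.backward()
print(float(loss))
```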
