Comments (6)
I have been studying the code recently, and I think you can use torch.repeat_interleave, as follows:
hidden = tuple([torch.repeat_interleave(h, self.k, dim=1) for h in encoder_hidden])
inflated_encoder_outputs = torch.repeat_interleave(encoder_outputs, self.k, dim=0)
from pytorch-seq2seq.
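To make the suggestion above concrete, here is a minimal, self-contained sketch of what `repeat_interleave` does to the encoder state when inflating it for beam search. The beam width `k` and the tensor shapes (`num_layers`, `batch`, `hidden_size`, `seq_len`) are assumptions for illustration, not values taken from the repo:

```python
import torch

k = 3  # assumed beam width
num_layers, batch, hidden_size, seq_len = 2, 4, 8, 5

# An LSTM hidden state is an (h, c) tuple with batch on dim 1.
encoder_hidden = (torch.randn(num_layers, batch, hidden_size),
                  torch.randn(num_layers, batch, hidden_size))
# Encoder outputs have batch on dim 0.
encoder_outputs = torch.randn(batch, seq_len, hidden_size)

# Repeat each batch element k times along its batch dimension, so the
# k beams of the same source sentence are adjacent rows.
hidden = tuple(torch.repeat_interleave(h, k, dim=1) for h in encoder_hidden)
inflated_encoder_outputs = torch.repeat_interleave(encoder_outputs, k, dim=0)

print(hidden[0].shape)                 # torch.Size([2, 12, 8])
print(inflated_encoder_outputs.shape)  # torch.Size([12, 5, 8])
```

Note that `repeat_interleave` repeats each element consecutively (rows 0..k-1 of the inflated tensor are copies of original row 0), which is the layout the beam-search bookkeeping expects; `Tensor.repeat` would instead tile the whole batch.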
Hi, I am studying the code and have similar doubts. However, can you clarify what you mean by decoder_output? Do you actually mean log_softmax_output?
@JojoFisherman Yes, I mean the output probabilities of the decoder, i.e. log_softmax_output.
I have the same question, and it surprises me that no one has answered it. If there were really something wrong in the beam search, surely it would output some weird sequences. Did you reach any conclusion about this?
Several other issues have also reported that beam search doesn't work correctly. Unfortunately, this repo may no longer be actively maintained. I currently use fairseq (the PyTorch version) for related experiments.
> I have been studying the code recently, and I think you can use torch.repeat_interleave, as follows:
> hidden = tuple([torch.repeat_interleave(h, self.k, dim=1) for h in encoder_hidden])
> inflated_encoder_outputs = torch.repeat_interleave(encoder_outputs, self.k, dim=0)
I had this problem with batch_size > 1, but after applying this fix it works now.
Thank you!!