Comments (9)
Hey @prabhakar267,
Sorry you're experiencing some issues. Can you tell me which version of PyTorch you are using? I cannot reproduce the issue with openkiwi==0.1.2 and torch > 1.1.
torch==1.4.0
Sorry for the late response. Unfortunately I haven't had the opportunity to test this. However I believe this issue has been fixed on #44. Can you try to install openkiwi from master?
I will test this shortly and tag a new version if my suspicions are confirmed.
@captainvera I installed from the master branch and the error message changed:
```
Traceback (most recent call last):
  File "using_kiwi_gpu.py", line 50, in <module>
    'target': target_texts
  File "/home/ubuntu/anaconda3/envs/kiwi/lib/python3.6/site-packages/kiwi/predictors/predictor.py", line 106, in predict
    return self.run(dataset, batch_size)
  File "/home/ubuntu/anaconda3/envs/kiwi/lib/python3.6/site-packages/kiwi/predictors/predictor.py", line 116, in run
    model_pred = self.model.predict(batch)
  File "/home/ubuntu/anaconda3/envs/kiwi/lib/python3.6/site-packages/kiwi/models/model.py", line 119, in predict
    model_out = self(batch)
  File "/home/ubuntu/anaconda3/envs/kiwi/lib/python3.6/site-packages/torch/nn/modules/module.py", line 547, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/ubuntu/anaconda3/envs/kiwi/lib/python3.6/site-packages/kiwi/models/predictor_estimator.py", line 324, in forward
    model_out_tgt = self.predictor_tgt(batch)
  File "/home/ubuntu/anaconda3/envs/kiwi/lib/python3.6/site-packages/torch/nn/modules/module.py", line 547, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/ubuntu/anaconda3/envs/kiwi/lib/python3.6/site-packages/kiwi/models/predictor.py", line 243, in forward
    source_embeddings = self.embedding_source(source)
  File "/home/ubuntu/anaconda3/envs/kiwi/lib/python3.6/site-packages/torch/nn/modules/module.py", line 547, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/ubuntu/anaconda3/envs/kiwi/lib/python3.6/site-packages/torch/nn/modules/sparse.py", line 114, in forward
    self.norm_type, self.scale_grad_by_freq, self.sparse)
  File "/home/ubuntu/anaconda3/envs/kiwi/lib/python3.6/site-packages/torch/nn/functional.py", line 1467, in embedding
    return torch.embedding(weight, input, padding_idx, scale_grad_by_freq, sparse)
RuntimeError: Expected object of backend CPU but got backend CUDA for argument #3 'index'
```
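For context, the RuntimeError above is PyTorch's generic device-mismatch error: an `nn.Embedding` whose weight tensor lives on one device received an index tensor from another. A minimal sketch of the same mismatch (illustrative only, not OpenKiwi code):

```python
import torch
import torch.nn as nn

# Illustrative sketch of the device mismatch, not OpenKiwi code:
# an embedding with CPU weights receiving a CUDA index tensor.
embedding = nn.Embedding(num_embeddings=10, embedding_dim=4)  # weights on CPU
idx = torch.tensor([1, 2, 3])

if torch.cuda.is_available():
    try:
        embedding(idx.cuda())  # CPU weights, CUDA indices -> RuntimeError
    except RuntimeError as err:
        print("device mismatch:", err)

# With both tensors on the same device, the lookup works:
out = embedding(idx)
print(out.shape)  # torch.Size([3, 4])
```

Moving the module and its inputs to the same device (e.g. `embedding.cuda()` together with `idx.cuda()`) avoids the error.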
Hey @prabhakar267,
What I did to try to reproduce your issue: I installed openkiwi from master in a new virtual env, downloaded the pretrained models available on the releases, and used this config:
```yaml
seed: 42
gpu-id: 0
debug: True
model: estimator
load-model: pre_trained/estimator/target_1/model.torch
wmt18-format: False
test-source: data/wmt19/en_de.nmt/test.src
test-target: data/wmt19/en_de.nmt/test.mt
valid-batch-size: 16
output-dir: tmp_out
```
I had no errors and was able to run the whole thing.
I then uninstalled openkiwi and pip installed it. Using the same config+model I was able to reproduce your original issue, confirming my suspicions. I'll tag a new version to fix the pip package.
On the other hand, your final error is kind of weird. The bug popped up in a completely different place. To help me reproduce it, can you tell me how you're calling kiwi? And can you try to use one of the pre-trained models to see if they work?
I'm getting the error using the predictor-estimator model.
```python
model = kiwi.load_model('trained_models/estimator_en_de.torch/estimator_en_de.torch')
model._device = torch.device("cuda")

source_texts = ...
target_texts = ...

predictions = model.predict({
    'source': source_texts,
    'target': target_texts
})
print(predictions)
```
Hey @prabhakar267 , sorry for the late response. I was away from work.
Thanks for sending the example script! I wasn't considering that use case and thought you were using kiwi as a terminal tool, not as a Python package.
The variable you are calling `model` (like we do in our examples) is actually an instance of the `Predicter` class. Furthermore, our code wasn't ready for a change of the `_device` variable after initialization.
What you would have to do is:
```python
model = kiwi.load_model('trained_models/estimator_en_de.torch/estimator_en_de.torch')
model.model.to("cuda")
model._device = torch.device("cuda")
```
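The reason both steps are needed can be sketched in plain PyTorch (illustrative only, with a hypothetical `Wrapper` class, not the actual OpenKiwi `Predicter`): `nn.Module.to()` moves registered parameters and submodules, but a plain Python attribute such as a stored device is left untouched, so it has to be updated by hand.

```python
import torch
import torch.nn as nn

# Hypothetical wrapper, for illustration only (not the OpenKiwi Predicter):
class Wrapper(nn.Module):
    def __init__(self):
        super().__init__()
        self.model = nn.Linear(2, 2)        # registered submodule: moved by .to()
        self._device = torch.device("cpu")  # plain attribute: ignored by .to()

w = Wrapper()
w.to("cpu")  # moves self.model's parameters; self._device must be set manually
print(w.model.weight.device.type)  # cpu
print(w._device.type)              # cpu
```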
Of course, this makes no sense from a design perspective. There's a PR #61 that adds an interface similar to PyTorch's.
Once that's merged you can do:
```python
model = kiwi.load_model('trained_models/estimator_en_de.torch/estimator_en_de.torch')
model.to("cuda")
```
With this, all predictions will be made using the GPU.
Hope this solves your issue.
edit: closed issue by mistake
Hey @prabhakar267,
Changes have been merged to master :) Let me know if you have any other problems.
Cheers
@prabhakar267, let us know if this is not solved.