murre's Issues

OpenNMT-py versioning / RuntimeError: result type Float can't be cast to the desired output type Long

Hi! I installed murre 1.0.1 with pip install murre (and tried pip3 install murre as well), but I cannot get it to work properly.

from murre import normalize_sentence
normalize_sentence("mä syön paljo karkkii")

This fails with:

'opennmt_opts' object has no attribute 'tgt_prefix'

Following the discussion here and experimenting with earlier versions of the other packages, I arrived at the following combination:

OpenNMT-py 1.0.0
torch 1.7.1
natas 1.0.5
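
For reference, this is the pinned install that produced that combination (a sketch; it assumes these exact versions are still published):

```shell
pip3 install "torch==1.7.1" "OpenNMT-py==1.0.0" "natas==1.0.5" "murre==1.0.1"
```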

I also found that torch versions older than 1.7.1 no longer seem to be installable.

However, this configuration produced, for the first time, a different error message (maybe it will shed some light on the problem):

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Users\first.last\AppData\Local\Programs\Python\Python38\lib\site-packages\murre\normalizer.py", line 47, in normalize_sentence
    res = _normalize_chunks(chunks, wnut19_model=wnut19_model)
  File "C:\Users\first.last\AppData\Local\Programs\Python\Python38\lib\site-packages\murre\normalizer.py", line 59, in _normalize_chunks
    res = call_onmt(chunks, model_name, n_best=1)
  File "C:\Users\first.last\AppData\Local\Programs\Python\Python38\lib\site-packages\natas\normalize.py", line 151, in call_onmt
    t.translate(
  File "C:\Users\first.last\AppData\Local\Programs\Python\Python38\lib\site-packages\onmt\translate\translator.py", line 351, in translate
    batch_data = self.translate_batch(
  File "C:\Users\first.last\AppData\Local\Programs\Python\Python38\lib\site-packages\onmt\translate\translator.py", line 546, in translate_batch
    return self._translate_batch_with_strategy(batch, src_vocabs,
  File "C:\Users\first.last\AppData\Local\Programs\Python\Python38\lib\site-packages\onmt\translate\translator.py", line 677, in _translate_batch_with_strategy
    decode_strategy.advance(log_probs, attn)
  File "C:\Users\first.last\AppData\Local\Programs\Python\Python38\lib\site-packages\onmt\translate\beam_search.py", line 190, in advance
    torch.div(self.topk_ids, vocab_size, out=self._batch_index)
RuntimeError: result type Float can't be cast to the desired output type Long
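
The failing call can be reproduced outside OpenNMT-py. Since torch 1.6, torch.div performs true (float) division even on integer tensors, so writing the result into a Long out tensor raises exactly this error. A minimal sketch of the problem and the floor-division fix (which, as far as I can tell, is what later OpenNMT-py releases adopted):

```python
import torch

ids = torch.tensor([10, 23, 7])         # stand-in for topk_ids
out = torch.empty(3, dtype=torch.long)  # stand-in for _batch_index

try:
    # True division yields a Float result, which cannot be written into
    # a Long `out` tensor -- the RuntimeError from the traceback above.
    torch.div(ids, 5, out=out)
except RuntimeError as e:
    print(e)

# Integer (floor) division keeps the result Long and succeeds:
torch.floor_divide(ids, 5, out=out)
print(out.tolist())  # [2, 4, 1]
```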

Could you let me know of any post-torch-1.2 package configuration that is known to work with murre?

Any help or advice would be greatly appreciated! Thanks in advance!

"pip3 install murre" failed because of pyomnttok dependency

Hi,

To reproduce this error:
Ubuntu Linux 5.8.0-63-generic #71~20.04.1-Ubuntu SMP / x86_64 x86_64 x86_64 GNU/Linux
Python 3.8.10 (default, Jun 2 2021, 10:49:15)
[GCC 9.4.0] on linux.

When running pip3 install murre, I got this:

ERROR: Could not find a version that satisfies the requirement pyonmttok==1.10.1 (from murre) (from versions: 1.16.0, 1.16.1, 1.17.0, 1.17.1, 1.17.2, 1.18.0, 1.18.1, 1.18.2, 1.18.3, 1.18.4, 1.18.5, 1.19.0, 1.20.0, 1.21.0, 1.22.0, 1.22.1, 1.22.2, 1.23.0, 1.24.0, 1.25.0, 1.26.0, 1.26.1, 1.26.2, 1.26.3, 1.26.4)
ERROR: No matching distribution found for pyonmttok==1.10.1

I have tried the following:

  • upgrading pip, and using both pip and pip3
  • installing the pyonmttok==1.10.1 wheel manually.
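
One thing I checked (a sketch, assuming the failure is a binary-wheel/tag mismatch rather than a missing package): pyonmttok publishes only binary wheels, so pip can offer only the versions that shipped a wheel matching one of the tags this interpreter accepts. Those tags can be listed with the packaging library:

```python
# List the wheel tags pip will accept for this interpreter/platform.
# If none of pyonmttok 1.10.1's published wheels match any of these tags,
# pip reports "No matching distribution found".
from packaging.tags import sys_tags

for tag in list(sys_tags())[:5]:
    print(tag)
```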

Best regards,
Lee

Not working on macOS M1

Hello,
After a complicated install process on macOS M1, I finally got it installed, but it does not run.

When running python3 mytest.py

mytest.py

from murre import normalize_sentence
from murre import dialectalize_sentence

dialectalize_sentence("kodin takana on koira", "Pohjois-Savo")
normalize_sentence("mä syön paljo karkkii")

I get the following error:

FileNotFoundError: Op type not registered 'Addons>GatherTree' in binary running on macbooks-MacBook-Pro.local. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.
 You may be trying to load on a different device from the computational device. Consider setting the `experimental_io_device` option in `tf.saved_model.LoadOptions` to the io_device such as '/job:localhost'.
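
One workaround I have seen suggested for this op-registration error (an assumption; I could not verify it on the M1): the Addons>GatherTree op comes from tensorflow-addons, and importing that package in the same process before the model loads is supposed to register the custom op.

```shell
pip3 install tensorflow-addons
# Hypothetical check: import tensorflow_addons before murre in one process.
python3 -c "import tensorflow_addons; from murre import normalize_sentence; print(normalize_sentence('mä syön paljo karkkii'))"
```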

Any suggestions on how to make it work?

Thank you.
