Comments (9)

giulianoformisano commented on August 23, 2024

@MuraliRamRavipati thank you so much! Downgrading torch to 1.12.1 and torchvision to 0.13.1 fixed the issue.

I created a new environment (python 3.8.8) and followed MuraliRamRavipati's suggestion.
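
For reference, a minimal sketch of that downgrade with pip inside the new environment (the exact command may differ if you install via conda or need a platform-specific wheel):

pip install torch==1.12.1 torchvision==0.13.1
python -c "import torch, torchvision; print(torch.__version__, torchvision.__version__)"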

computermacgyver commented on August 23, 2024

Can you please follow these directions to install PyTorch? https://pytorch.org/get-started/locally/

If it still does not work, please provide the specific versions of the packages you have installed (e.g. the output of pip freeze).

zijwang commented on August 23, 2024

@giulianoformisano could you verify whether torch is installed correctly? You could try import torch in a Python console inside your virtualenv.
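
For example, a minimal check in the virtualenv's Python console; it only confirms that torch imports and reports its version:

import torch                      # should not raise ModuleNotFoundError
print(torch.__version__)          # e.g. 1.13.1
print(torch.cuda.is_available())  # False is expected on a Mac without CUDA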

giulianoformisano commented on August 23, 2024

Thanks a lot. Unfortunately, I couldn't fix the issue by following the instructions (https://pytorch.org/get-started/locally/).

Please find all my installed packages:

appnope==0.1.3
asttokens==2.2.1
backcall==0.2.0
certifi==2022.12.7
charset-normalizer==2.1.1
comm==0.1.2
contourpy==1.0.6
cycler==0.11.0
debugpy==1.6.4
decorator==5.1.1
entrypoints==0.4
executing==1.2.0
fonttools==4.38.0
idna==3.4
ipykernel==6.19.4
ipython==8.7.0
jedi==0.18.2
jupyter_client==7.4.8
jupyter_core==5.1.1
kiwisolver==1.4.4
m3inference==1.1.5
matplotlib==3.6.2
matplotlib-inline==0.1.6
nest-asyncio==1.5.6
numpy==1.24.1
packaging==22.0
pandas==1.5.2
parso==0.8.3
pexpect==4.8.0
pickleshare==0.7.5
Pillow==9.3.0
platformdirs==2.6.0
prompt-toolkit==3.0.36
psutil==5.9.4
ptyprocess==0.7.0
pure-eval==0.2.2
pycld2==0.41
Pygments==2.13.0
pyparsing==3.0.9
python-dateutil==2.8.2
pytz==2022.7
pyzmq==24.0.1
rauth==0.7.3
requests==2.28.1
seaborn==0.12.1
six==1.16.0
stack-data==0.6.2
torch==1.13.1
torchvision==0.14.1
tornado==6.2
tqdm==4.64.1
traitlets==5.8.0
typing_extensions==4.4.0
urllib3==1.26.13
wcwidth==0.2.5

wdwgonzales commented on August 23, 2024

@computermacgyver @zijwang

Everything was working fine earlier this year. When I tried running the script this time around, I encountered the same problem as @giulianoformisano. Some context: I work on an M1 Mac, and torch itself seems to be working just fine for me.

(base) ➜  scripts python m3twitter.py --id=19854920 --auth auth.txt --skip-cache
Traceback (most recent call last):
  File "/Users/wdwg/Desktop/scripts/m3twitter.py", line 4, in <module>
    from m3inference import M3Twitter
  File "/Users/wdwg/opt/anaconda3/lib/python3.9/site-packages/m3inference/__init__.py", line 1, in <module>
    from .m3inference import M3Inference
  File "/Users/wdwg/opt/anaconda3/lib/python3.9/site-packages/m3inference/m3inference.py", line 14, in <module>
    from .full_model import M3InferenceModel
  File "/Users/wdwg/opt/anaconda3/lib/python3.9/site-packages/m3inference/full_model.py", line 11, in <module>
    class M3InferenceModel(nn.Module):
  File "/Users/wdwg/opt/anaconda3/lib/python3.9/site-packages/m3inference/full_model.py", line 12, in M3InferenceModel
    def __init__(self, device='cuda' if torch.cuda.is_available() else 'cpu'):
NameError: name 'torch' is not defined

I tried to fix it by adding import torch, torchvision to the downloaded scripts. I also ran the following command to update PyTorch:

conda install pytorch torchvision torchaudio -c pytorch-nightly

After that, I added map_location=torch.device('mps') as an argument wherever torch.load(...) appears in the scripts, specifically in the m3inference.py file. I chose MPS because recent PyTorch builds added support for Apple's Metal (MPS) backend; I tried "cpu" but it didn't work.
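
For anyone making the same edits, here is a minimal sketch of the kind of change described above. The model path is the one from the log below, and the variable names are illustrative; the real call lives inside m3inference's own code:

import torch
import torchvision  # added alongside torch, as described above

# Load the pretrained weights onto the Apple MPS backend instead of CUDA.
state_dict = torch.load("/Users/wdwg/m3/models/full_model.mdl",
                        map_location=torch.device("mps"))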

Anyway, running the program with the above adjustments partially fixed the problem; now I am stuck on a segmentation fault.

(base) ➜  scripts python m3twitter.py --id=19854920 --auth auth.txt --skip-cache
12/31/2022 17:46:16 - INFO - m3inference.m3inference -   Version 1.1.5
12/31/2022 17:46:16 - INFO - m3inference.m3inference -   Running on cpu.
12/31/2022 17:46:16 - INFO - m3inference.m3inference -   Will use full M3 model.
12/31/2022 17:46:17 - INFO - m3inference.m3inference -   Model full_model exists at /Users/wdwg/m3/models/full_model.mdl.
12/31/2022 17:46:17 - INFO - m3inference.utils -   Checking MD5 for model full_model at /Users/wdwg/m3/models/full_model.mdl
12/31/2022 17:46:17 - INFO - m3inference.utils -   MD5s match.
12/31/2022 17:46:18 - INFO - m3inference.m3inference -   Loaded pretrained weight at /Users/wdwg/m3/models/full_model.mdl
/Users/wdwg/Desktop/scripts
12/31/2022 17:46:18 - INFO - m3inference.m3twitter -   skip_cache is True. Fetching data from Twitter for id 19854920.
12/31/2022 17:46:18 - INFO - m3inference.m3twitter -   GET /users/show.json?id=19854920
12/31/2022 17:46:18 - INFO - m3inference.dataset -   1 data entries loaded.
Predicting...:   0%|                                                    | 0/1 [00:00<?, ?it/s]
[1]    27886 segmentation fault  python m3twitter.py --id=19854920 --auth auth.txt --skip-cache
(base) ➜  scripts /Users/wdwg/opt/anaconda3/lib/python3.9/multiprocessing/resource_tracker.py:216: UserWarning: resource_tracker: There appear to be 12 leaked semaphore objects to clean up at shutdown
  warnings.warn('resource_tracker: There appear to be %d '

I use this package heavily (and cite it whenever I can). It would be a shame for it not to work in future projects. :(

wdwgonzales commented on August 23, 2024

Update:

I switched to the pytorch-nightly environment using conda activate torch-nightly, made the same modifications as detailed above, and, to my surprise, it worked:

(torch-nightly) ➜  scripts python m3twitter.py --screen-name=barackobama --auth auth.txt
12/31/2022 18:13:52 - INFO - m3inference.m3inference -   Version 1.1.5
12/31/2022 18:13:52 - INFO - m3inference.m3inference -   Running on cpu.
12/31/2022 18:13:52 - INFO - m3inference.m3inference -   Will use full M3 model.
12/31/2022 18:13:53 - INFO - m3inference.m3inference -   Model full_model exists at /Users/wdwg/m3/models/full_model.mdl.
12/31/2022 18:13:53 - INFO - m3inference.utils -   Checking MD5 for model full_model at /Users/wdwg/m3/models/full_model.mdl
12/31/2022 18:13:53 - INFO - m3inference.utils -   MD5s match.
12/31/2022 18:13:54 - INFO - m3inference.m3inference -   Loaded pretrained weight at /Users/wdwg/m3/models/full_model.mdl
/Users/wdwg/Desktop/scripts
12/31/2022 18:13:54 - INFO - m3inference.m3twitter -   Results not in cache. Fetching data from Twitter for barackobama.
12/31/2022 18:13:54 - INFO - m3inference.m3twitter -   GET /users/show.json?screen_name=barackobama
12/31/2022 18:13:55 - INFO - m3inference.dataset -   1 data entries loaded.
Predicting...:   0%|                                                                                                      | 0/1 [00:00<?, ?it/s][W NNPACK.cpp:64] Could not initialize NNPACK! Reason: Unsupported hardware.
Predicting...: 100%|██████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:07<00:00,  7.58s/it]
{'input': {'description': 'Dad, husband, President, citizen.',
           'id': '813286',
           'img_path': '/Users/wdwg/m3/cache/813286_224x224.jpg',
           'lang': 'en',
           'name': 'Barack Obama',
           'screen_name': 'BarackObama'},
 'output': {'age': {'19-29': 0.0003,
                    '30-39': 0.0003,
                    '<=18': 0.0004,
                    '>=40': 0.9991},
            'gender': {'female': 0.0004, 'male': 0.9996},
            'org': {'is-org': 0.0046, 'non-org': 0.9954}}}

computermacgyver commented on August 23, 2024

Interesting. Thanks, @wdwgonzales. It sounds like the segmentation fault from #26 is solved by using the latest (nightly) build and adding map_location=torch.device('mps') as an argument wherever there is torch.load(...). That is probably something we could do automatically if we detect that the machine has an M1/arm64 architecture.
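
A rough sketch of what that automatic detection might look like (this is speculative, not code that exists in m3inference today, and the helper name pick_map_location is made up for illustration):

import platform
import torch

def pick_map_location():
    # Prefer MPS on Apple Silicon (arm64 macOS) when the installed torch
    # build actually exposes and supports the MPS backend.
    on_apple_silicon = platform.system() == "Darwin" and platform.machine() == "arm64"
    mps_backend = getattr(torch.backends, "mps", None)
    if on_apple_silicon and mps_backend is not None and mps_backend.is_available():
        return torch.device("mps")
    return torch.device("cuda" if torch.cuda.is_available() else "cpu")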

What computer architecture and OS are you using, @giulianoformisano? And what version of Python?

giulianoformisano commented on August 23, 2024

Thanks a lot for your input! I followed @wdwgonzales's procedure, but it didn't fix the issue.

I am currently using an M2 Mac, with Python 3.9.6 and 3.8.8. I also tried the same procedure on Windows Server 2012 and got the same error.

MuraliRamRavipati commented on August 23, 2024

Downgrading torch to 1.12.1 and torchvision to 0.13.1 worked for me.
