Comments (9)

srbhr commented on May 25, 2024

Thanks @samascience and @BThomas22tech for raising the issues.

I'm trying to fix these issues; please give me some time.

from resume-matcher.

srbhr commented on May 25, 2024

Okay, I need to drop Qdrant and Cohere from requirements.txt and try again.
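For anyone following along, that removal can be sketched in shell. The sample pins below are placeholders for illustration, not the project's actual requirements.txt:

```shell
# Sketch: strip the qdrant-client and cohere pins from requirements.txt.
# The sample file written here is a stand-in, not Resume-Matcher's real file.
printf 'qdrant-client==1.3.1\ncohere==4.21\nstreamlit==1.24.1\n' > requirements.txt

# Keep every line that does not start with either package name.
grep -vE '^(qdrant-client|cohere)([=<>]|$)' requirements.txt > requirements.filtered
mv requirements.filtered requirements.txt
cat requirements.txt   # only streamlit==1.24.1 remains
```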

BThomas22tech commented on May 25, 2024

I am receiving this error too. I tried updating all the dependencies in the text file to their latest versions but still got dependency errors. Since there were only a few, I removed the version requirements from the dependencies. That got me a little further in the process, but now I have a different kind of error, which reads like this:

Installing build dependencies ... error
  error: subprocess-exited-with-error

  × pip subprocess to install build dependencies did not run successfully.
  │ exit code: 1
  ╰─> [64 lines of output]
      Collecting setuptools
        Obtaining dependency information for setuptools from https://files.pythonhosted.org/packages/95/79/6b47c6a872b40743a480687dc0c79ffb4202710789f3e4d54a84fff8b550/setuptools-68.2.1-py3-none-any.whl.metadata
        Using cached setuptools-68.2.1-py3-none-any.whl.metadata (6.3 kB)
      Collecting cython<3.0,>=0.25
        Obtaining dependency information for cython<3.0,>=0.25 from https://files.pythonhosted.org/packages/3f/d6/9eed523aeaca42acbaa3e6d3850edae780dc7f8da9df1bf6a2ceb851839c/Cython-0.29.36-py2.py3-none-any.whl.metadata
        Using cached Cython-0.29.36-py2.py3-none-any.whl.metadata (3.1 kB)
      Collecting murmurhash<1.1.0,>=1.0.2
        Using cached murmurhash-1.0.9-cp311-cp311-win_amd64.whl (18 kB)
      Collecting cymem<2.1.0,>=2.0.2
        Using cached cymem-2.0.7-cp311-cp311-win_amd64.whl (28 kB)
      Collecting preshed<3.1.0,>=3.0.2
        Using cached preshed-3.0.8-cp311-cp311-win_amd64.whl (91 kB)
      Collecting blis<0.10.0,>=0.7.8
        Using cached blis-0.9.1.tar.gz (3.6 MB)
        Installing build dependencies: started
        Installing build dependencies: finished with status 'done'
        Getting requirements to build wheel: started
        Getting requirements to build wheel: finished with status 'done'
        Installing backend dependencies: started
        Installing backend dependencies: finished with status 'done'
        Preparing metadata (pyproject.toml): started
        Preparing metadata (pyproject.toml): finished with status 'done'
      Collecting numpy>=1.15.0
        Obtaining dependency information for numpy>=1.15.0 from https://files.pythonhosted.org/packages/72/b2/02770e60c4e2f7e158d923ab0dea4e9f146a2dbf267fec6d8dc61d475689/numpy-1.25.2-cp311-cp311-win_amd64.whl.metadata
        Using cached numpy-1.25.2-cp311-cp311-win_amd64.whl.metadata (5.7 kB)
      Using cached setuptools-68.2.1-py3-none-any.whl (807 kB)
      Using cached Cython-0.29.36-py2.py3-none-any.whl (988 kB)
      Using cached numpy-1.25.2-cp311-cp311-win_amd64.whl (15.5 MB)
      Building wheels for collected packages: blis
        Building wheel for blis (pyproject.toml): started
        Building wheel for blis (pyproject.toml): finished with status 'error'
        error: subprocess-exited-with-error
     
        Building wheel for blis (pyproject.toml) did not run successfully.
        exit code: 1
     
        [21 lines of output]
        BLIS_COMPILER? None
        running bdist_wheel
        running build
        running build_py
        creating build
        creating build\lib.win-amd64-cpython-311
        creating build\lib.win-amd64-cpython-311\blis
        copying blis\about.py -> build\lib.win-amd64-cpython-311\blis
        copying blis\benchmark.py -> build\lib.win-amd64-cpython-311\blis
        copying blis\__init__.py -> build\lib.win-amd64-cpython-311\blis
        creating build\lib.win-amd64-cpython-311\blis\tests
        copying blis\tests\common.py -> build\lib.win-amd64-cpython-311\blis\tests
        copying blis\tests\test_dotv.py -> build\lib.win-amd64-cpython-311\blis\tests
        copying blis\tests\test_gemm.py -> build\lib.win-amd64-cpython-311\blis\tests
        copying blis\tests\__init__.py -> build\lib.win-amd64-cpython-311\blis\tests
        copying blis\cy.pyx -> build\lib.win-amd64-cpython-311\blis
        copying blis\py.pyx -> build\lib.win-amd64-cpython-311\blis
        copying blis\cy.pxd -> build\lib.win-amd64-cpython-311\blis
        copying blis\__init__.pxd -> build\lib.win-amd64-cpython-311\blis
        running build_ext
        error: Microsoft Visual C++ 14.0 or greater is required. Get it with "Microsoft C++ Build Tools": https://visualstudio.microsoft.com/visual-cpp-build-tools/
        [end of output]

        note: This error originates from a subprocess, and is likely not a problem with pip.
        ERROR: Failed building wheel for blis
      Failed to build blis
      ERROR: Could not build wheels for blis, which is required to install pyproject.toml-based projects
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× pip subprocess to install build dependencies did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.

So I guess I need to download this build tool and try again?
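Yes: the last error in the log means blis has no prebuilt wheel being used and must be compiled locally, and no MSVC compiler was found. A workaround sketch (not official project guidance) is to install the "Microsoft C++ Build Tools" from the URL in the error message, or to ask pip to prefer prebuilt wheels so nothing needs compiling at all:

```shell
# Workaround sketch: prefer prebuilt wheels so blis does not need a local
# C compiler (assumes a wheel exists for your Python version and platform).
pip install --prefer-binary -r requirements.txt

# Alternatively, install the compiler the error message asks for:
# https://visualstudio.microsoft.com/visual-cpp-build-tools/
```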

wmxace commented on May 25, 2024

Hi, I also tried it and am getting this error:

#0 24.38 qdrant-client 1.3.2 depends on urllib3<2.0.0 and >=1.26.14
#0 24.38 The user requested typing_extensions==4.7.1
#0 24.38 pydantic 1.10.11 depends on typing-extensions>=4.2.0
#0 24.38 streamlit 1.24.1 depends on typing-extensions<5 and >=4.0.1
#0 24.38 typer 0.9.0 depends on typing-extensions>=3.7.4.3
#0 24.38 qdrant-client 1.3.1 depends on typing-extensions<4.6.0 and >=4.0.0
#0 24.38 The user requested typing_extensions==4.7.1
#0 24.38 pydantic 1.10.11 depends on typing-extensions>=4.2.0
#0 24.38 streamlit 1.24.1 depends on typing-extensions<5 and >=4.0.1
#0 24.38 typer 0.9.0 depends on typing-extensions>=3.7.4.3
#0 24.38 qdrant-client 1.3.0 depends on typing-extensions<4.6.0 and >=4.0.0
#0 24.38 The user requested typing_extensions==4.7.1
#0 24.38 pydantic 1.10.11 depends on typing-extensions>=4.2.0
#0 24.38 streamlit 1.24.1 depends on typing-extensions<5 and >=4.0.1
#0 24.38 typer 0.9.0 depends on typing-extensions>=3.7.4.3
#0 24.38 qdrant-client 1.2.0 depends on typing-extensions<4.6.0 and >=4.0.0
#0 24.38
#0 24.38 To fix this you could try to:
#0 24.38 1. loosen the range of package versions you've specified
#0 24.38 2. remove package versions to allow pip attempt to solve the dependency conflict
#0 24.38
#0 24.38 ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts

build.dockerfile:7

5 | RUN apt-get install -y build-essential python-dev git
6 | RUN pip install -U pip setuptools wheel
7 | >>> RUN pip install -r requirements.txt
8 | RUN python run_first.py
9 | ENTRYPOINT [ "streamlit", "run", "streamlit_app.py"]

ERROR: failed to solve: process "/bin/sh -c pip install -r requirements.txt" did not complete successfully: exit code: 1
ERROR: Service 'resume-matcher' failed to build : Build failed

This is the ls output of the directory in which I am running "docker-compose build --no-cache":
-rw-rw-r-- 1 ali ali 13483 Sep 12 06:53 streamlit_second.py
-rw-rw-r-- 1 ali ali 10664 Sep 12 06:53 streamlit_app.py
-rw-rw-r-- 1 ali ali 2726 Sep 12 06:53 run_first.py
-rw-rw-r-- 1 ali ali 2492 Sep 12 06:53 requirements.txt
-rw-r--r-- 1 root root 159 Sep 12 07:09 docker-compose.yml
-rw-r--r-- 1 root root 301 Sep 12 07:11 build.dockerfile
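The resolver output above shows the actual conflict: the file pins typing_extensions==4.7.1, while every qdrant-client release pip tried requires typing-extensions<4.6.0. One way to follow pip's "loosen the range" suggestion is a pin like the following; this is a sketch with ranges taken from the log above, not verified against the project:

```text
# requirements.txt fragment (sketch)
typing_extensions>=4.2.0,<4.6.0   # per the log: qdrant-client needs <4.6.0,
                                  # pydantic >=4.2.0, streamlit >=4.0.1,<5,
                                  # typer >=3.7.4.3
```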

samascience commented on May 25, 2024

When Qdrant and Cohere were dropped, I got this error:

File "/usr/local/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 552, in _run_script
  exec(code, module.__dict__)
File "/data/Resume-Matcher/streamlit_app.py", line 15, in <module>
  from scripts.similarity import get_similarity_score, find_path, read_config
File "/data/Resume-Matcher/scripts/similarity/__init__.py", line 1, in <module>
  from .get_similarity_score import get_similarity_score, find_path, read_config
File "/data/Resume-Matcher/scripts/similarity/get_similarity_score.py", line 5, in <module>
  import cohere

samascience commented on May 25, 2024

Removing intermediate container 6bfc16cf3fc6
---> 75cfc28a6a43
Step 8/10 : RUN python run_first.py
---> Running in 082e908b9550
2023-09-13 05:04:10,800 (run_first.py:31) - INFO: Started to read from Data/Resumes
2023-09-13 05:04:10,800 (run_first.py:28) - INFO: Deleted old files from Data/Processed/Resumes
2023-09-13 05:04:10,801 (run_first.py:38) - INFO: Reading from Data/Resumes is now complete.
2023-09-13 05:04:10,801 (run_first.py:48) - INFO: Started parsing the resumes.
2023-09-13 05:04:13,345 (core.py:65) - INFO: loaded 'en_core_web_md' spaCy language pipeline
2023-09-13 05:04:18,117 (run_first.py:52) - INFO: Parsing of the resumes is now complete.
2023-09-13 05:04:18,117 (run_first.py:54) - INFO: Started to read from Data/JobDescription
2023-09-13 05:04:18,118 (run_first.py:28) - INFO: Deleted old files from Data/Processed/JobDescription
2023-09-13 05:04:18,118 (run_first.py:61) - INFO: Reading from Data/JobDescription is now complete.
2023-09-13 05:04:18,118 (run_first.py:72) - INFO: Started parsing the Job Descriptions.
2023-09-13 05:04:22,613 (run_first.py:76) - INFO: Parsing of the Job Descriptions is now complete.
2023-09-13 05:04:22,613 (run_first.py:77) - INFO: Success now run streamlit run streamlit_second.py
Removing intermediate container 082e908b9550
---> df09d358d228
Step 9/10 : ENTRYPOINT [ "streamlit", "run", "streamlit_app.py"]
---> Running in 45fc219d5388
Removing intermediate container 45fc219d5388
---> 284f2de380c8
Step 10/10 : EXPOSE 8501
---> Running in 16e93fe7e943
Removing intermediate container 16e93fe7e943
---> df6ef044b1d1

Successfully built df6ef044b1d1
Successfully tagged resume-matcher:latest
WARNING: Image for service resume-matcher was built because it did not already exist. To rebuild this image you must use docker-compose build or docker-compose up --build.
Creating resume-matcher_resume-matcher_1 ... done
Attaching to resume-matcher_resume-matcher_1
resume-matcher_1 |
resume-matcher_1 | Collecting usage statistics. To deactivate, set browser.gatherUsageStats to False.
resume-matcher_1 |
resume-matcher_1 |
resume-matcher_1 | You can now view your Streamlit app in your browser.
resume-matcher_1 |
resume-matcher_1 | Network URL: http://172.19.0.2:8501
resume-matcher_1 | External URL: http://73.77.103.176:8501
resume-matcher_1 |

resume-matcher_1 | 2023-09-13 05:05:56.039 Uncaught app exception
resume-matcher_1 | Traceback (most recent call last):
resume-matcher_1 | File "/usr/local/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 552, in _run_script
resume-matcher_1 |   exec(code, module.__dict__)
resume-matcher_1 | File "/data/Resume-Matcher/streamlit_app.py", line 15, in <module>
resume-matcher_1 |   from scripts.similarity import get_similarity_score, find_path, read_config
resume-matcher_1 | File "/data/Resume-Matcher/scripts/similarity/__init__.py", line 1, in <module>
resume-matcher_1 |   from .get_similarity_score import get_similarity_score, find_path, read_config
resume-matcher_1 | File "/data/Resume-Matcher/scripts/similarity/get_similarity_score.py", line 5, in <module>
resume-matcher_1 |   import cohere
resume-matcher_1 | ModuleNotFoundError: No module named 'cohere'
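Until the cohere dependency question is settled, the import at get_similarity_score.py line 5 could be made optional so the app still starts without the package. A hedged sketch; the guard and helper below are illustrations, not the project's actual code:

```python
# Sketch: tolerate a missing cohere package (it was dropped from
# requirements.txt) instead of crashing the whole Streamlit app at import time.
try:
    import cohere  # only needed for the Cohere-based scoring path
except ModuleNotFoundError:
    cohere = None  # callers must check for None before using the client


def cohere_available() -> bool:
    """Report whether the optional cohere dependency is importable."""
    return cohere is not None
```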

srbhr commented on May 25, 2024

@samascience, please do not run streamlit_second.

samascience commented on May 25, 2024

Got it; I changed it and now it is running fine. Thanks!

2023-09-13 13:22:05,895 (run_first.py:77) - INFO: Success now run streamlit run streamlit_second.py
root@ae55ecd95382:/data/Resume-Matcher# streamlit run streamlit_second.py

Collecting usage statistics. To deactivate, set browser.gatherUsageStats to False.

You can now view your Streamlit app in your browser.

Network URL: http://10.0.0.12:8502
External URL: http://73.77.103.176:8502

[nltk_data] Downloading package punkt to /root/nltk_data...
[nltk_data] Unzipping tokenizers/punkt.zip.
[nltk_data] Downloading package punkt to /root/nltk_data...
[nltk_data] Package punkt is already up-to-date!
[nltk_data] Downloading package punkt to /root/nltk_data...
[nltk_data] Package punkt is already up-to-date!
[nltk_data] Downloading package punkt to /root/nltk_data...
[nltk_data] Package punkt is already up-to-date!
[nltk_data] Downloading package punkt to /root/nltk_data...
[nltk_data] Package punkt is already up-to-date!
[nltk_data] Downloading package punkt to /root/nltk_data...
[nltk_data] Package punkt is already up-to-date!
[nltk_data] Downloading package punkt to /root/nltk_data...
[nltk_data] Package punkt is already up-to-date!
[nltk_data] Downloading package punkt to /root/nltk_data...
[nltk_data] Package punkt is already up-to-date!
[nltk_data] Downloading package punkt to /root/nltk_data...
[nltk_data] Package punkt is already up-to-date!

srbhr commented on May 25, 2024

That's great, @samascience! You're welcome!
