deepakpadhi986 / ai-resume-analyzer

AI Resume Analyzer is a tool that parses information from a resume using natural language processing, extracts keywords, and clusters them into sectors based on those keywords. Finally, it shows recommendations, predictions, and analytics to the applicant based on keyword matching.
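The keyword-based sector clustering described above can be sketched as follows. This is a hypothetical illustration: the sector names and keyword sets are made up for the example and are not the project's actual data.

```python
# Hypothetical sketch of the keyword-matching idea: score a resume's
# extracted keywords against per-sector keyword sets and pick the best fit.
FIELD_KEYWORDS = {
    "Data Science": {"tensorflow", "pandas", "machine learning"},
    "Web Development": {"react", "django", "javascript"},
}

def best_field(resume_keywords):
    """Return the sector whose keyword set overlaps most with the resume."""
    normalized = {k.lower() for k in resume_keywords}
    scores = {
        field: len(words & normalized)
        for field, words in FIELD_KEYWORDS.items()
    }
    return max(scores, key=scores.get)
```

Recommendations and analytics can then be driven by the per-sector scores rather than only the winning sector.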

License: MIT License

Python 100.00%
final-year-project resume-analyser resume-analysis ai-resume-analyser ai-resume-analyzer final-year-college-poject ai-resume-analyzer-github resume-analyses final-project final-year-project-idea

ai-resume-analyzer's Introduction

  • 👋 Hi, I'm @deepakpadhi986
  • 👀 I'm interested in Web Development
  • 🌱 I'm currently learning advanced web-app technologies
  • 💞️ I'm looking to collaborate on web-based projects

ai-resume-analyzer's People

Contributors

deepakpadhi986


ai-resume-analyzer's Issues

What's this error? Please explain it briefly

ValueError: [E1005] Unable to set attribute 'POS' in tokenizer exception for ' '. Tokenizer exceptions are only allowed to specify ORTH and NORM.

Traceback:
File "D:\pro\AI-Resume-Analyzer-main\venvapp\Lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 535, in run_script
    exec(code, module.__dict__)
File "D:\pro\AI-Resume-Analyzer-main\App\App.py", line 804, in <module>
    run()
File "D:\pro\AI-Resume-Analyzer-main\App\App.py", line 269, in run
    resume_data = ResumeParser(save_image_path).get_extracted_data()
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\pro\AI-Resume-Analyzer-main\venvapp\Lib\site-packages\pyresparser\resume_parser.py", line 21, in __init__
    custom_nlp = spacy.load(os.path.dirname(os.path.abspath(__file__)))
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\pro\AI-Resume-Analyzer-main\venvapp\Lib\site-packages\spacy\__init__.py", line 51, in load
    return util.load_model(
           ^^^^^^^^^^^^^^^^
File "D:\pro\AI-Resume-Analyzer-main\venvapp\Lib\site-packages\spacy\util.py", line 467, in load_model
    return load_model_from_path(Path(name), **kwargs)  # type: ignore[arg-type]
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\pro\AI-Resume-Analyzer-main\venvapp\Lib\site-packages\spacy\util.py", line 547, in load_model_from_path
    return nlp.from_disk(model_path, exclude=exclude, overrides=overrides)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\pro\AI-Resume-Analyzer-main\venvapp\Lib\site-packages\spacy\language.py", line 2184, in from_disk
    util.from_disk(path, deserializers, exclude)  # type: ignore[arg-type]
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\pro\AI-Resume-Analyzer-main\venvapp\Lib\site-packages\spacy\util.py", line 1372, in from_disk
    reader(path / key)
File "D:\pro\AI-Resume-Analyzer-main\venvapp\Lib\site-packages\spacy\language.py", line 2170, in <lambda>
    deserializers["tokenizer"] = lambda p: self.tokenizer.from_disk(  # type: ignore[union-attr]
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "spacy\tokenizer.pyx", line 771, in spacy.tokenizer.Tokenizer.from_disk
File "spacy\tokenizer.pyx", line 839, in spacy.tokenizer.Tokenizer.from_bytes
File "spacy\tokenizer.pyx", line 123, in spacy.tokenizer.Tokenizer.rules.__set__
File "spacy\tokenizer.pyx", line 571, in spacy.tokenizer.Tokenizer._load_special_cases
File "spacy\tokenizer.pyx", line 601, in spacy.tokenizer.Tokenizer.add_special_case
File "spacy\tokenizer.pyx", line 589, in spacy.tokenizer.Tokenizer._validate_sp
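The E1005 error comes from spaCy 3.x validating tokenizer exceptions as a model is loaded from disk: models exported under spaCy 2.x may store attributes such as POS in their special cases, which spaCy 3.x rejects. A simplified, hypothetical sketch of that check (not spaCy's actual implementation):

```python
# Only these attributes are allowed in a spaCy 3.x tokenizer special case.
ALLOWED_ATTRS = {"ORTH", "NORM"}

def validate_special_case(chunk, substrings):
    """Simplified sketch of spaCy 3's tokenizer-exception validation:
    each token dict in a special case may only carry ORTH and NORM."""
    for attrs in substrings:
        extra = set(attrs) - ALLOWED_ATTRS
        if extra:
            raise ValueError(
                f"[E1005] Unable to set attribute '{sorted(extra)[0]}' in "
                f"tokenizer exception for '{chunk}'. Tokenizer exceptions "
                f"are only allowed to specify ORTH and NORM."
            )

# An ORTH/NORM-only special case passes; a 2.x-style one with POS would raise.
validate_special_case("don't", [{"ORTH": "do"}, {"ORTH": "n't", "NORM": "not"}])
```

In practice the usual fix is to load the bundled model with a spaCy version matching the one it was trained with, or to retrain/re-export the model under spaCy 3.x.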

Dockerize project

Hello, I am interested in trying your project. Do you have a Dockerfile for it? If not, would you accept a Dockerfile as a contribution?
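For reference, a minimal Dockerfile for a Streamlit app of this shape might look like the sketch below. This is an assumption-laden sketch, not the project's actual configuration: the entrypoint path App/App.py is taken from the tracebacks in this thread, while the base image and port are Streamlit defaults.

```dockerfile
# Hypothetical Dockerfile sketch for running the Streamlit app in a container.
FROM python:3.10-slim
WORKDIR /app
# Install dependencies first so Docker layer caching skips them on code changes.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
# Streamlit serves on 8501 by default.
EXPOSE 8501
CMD ["streamlit", "run", "App/App.py", "--server.address=0.0.0.0"]
```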

Request for Attribution/Reference for the Smart-Resume-Analyzer Project [Original Source of This Project]

Hello Deepak,

Approximately two years ago (on March 6, 2022), I created the Smart-Resume-Analyzer project. I uploaded it to GitHub and also created a tutorial video on YouTube, which comprehensively explains the project. Recently, some subscribers have reached out to me, expressing concerns that you may be using my repository code or projects and distributing them commercially.

I want to clarify that I don't have any issues with this usage. However, what I do expect is proper credit. Upon reviewing your code, I've noticed that nearly all of the code and scripts I can see are similar to my repository, and I can see that you created this repository around 26 September 2022.

I hope you can empathize with my perspective on this matter. I don't mind if you are profiting from and selling my work, but it's essential to provide some acknowledgment to me. Several subscribers have contacted me previously about this, but I chose to overlook it. However, upon learning that you are using it commercially and selling it without crediting me, I felt compelled to reach out.

Please feel free to get in touch with me. I'm not here to engage in a dispute with you but simply to request that you respect the open-source community.

Regards,
Kushal Bhavsar

'This error originates from a subprocess, and is likely not a problem with pip'

Hi,
I tried to clone your project AI-Resume-Analyzer. Unfortunately, an error occurred while installing requirements.txt:

`Traceback (most recent call last):
File "D:\AI-Resume-Analyzer\venvapp\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 353, in <module>
    main()
File "D:\AI-Resume-Analyzer\venvapp\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 335, in main
    json_out['return_val'] = hook(**hook_input['kwargs'])
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\AI-Resume-Analyzer\venvapp\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 118, in get_requires_for_build_wheel
    return hook(config_settings)
           ^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\suyeon\AppData\Local\Temp\pip-build-env-14xqsx9t\overlay\Lib\site-packages\setuptools\build_meta.py", line 355, in get_requires_for_build_wheel
    return self._get_build_requires(config_settings, requirements=['wheel'])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\suyeon\AppData\Local\Temp\pip-build-env-14xqsx9t\overlay\Lib\site-packages\setuptools\build_meta.py", line 325, in _get_build_requires
    self.run_setup()
File "C:\Users\suyeon\AppData\Local\Temp\pip-build-env-14xqsx9t\overlay\Lib\site-packages\setuptools\build_meta.py", line 341, in run_setup
    exec(code, locals())
File "<string>", line 256, in <module>
File "<string>", line 195, in setup_package
File "C:\Users\suyeon\AppData\Local\Temp\pip-build-env-14xqsx9t\overlay\Lib\site-packages\Cython\Build\Dependencies.py", line 1154, in cythonize
    cythonize_one(*args)
File "C:\Users\suyeon\AppData\Local\Temp\pip-build-env-14xqsx9t\overlay\Lib\site-packages\Cython\Build\Dependencies.py", line 1321, in cythonize_one
    raise CompileError(None, pyx_file)
Cython.Compiler.Errors.CompileError: thinc/extra/eg.pyx
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.`

I don't know if the problem is with my setup.
Could you possibly help me with this?
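A Cython CompileError in thinc usually means pip found no prebuilt wheel for the pinned thinc/spaCy releases on the current interpreter and fell back to compiling them from source, which fails on newer Pythons. A hedged sketch of a pre-install check follows; the supported version range here is an assumption based on the era of these pinned dependencies, not a documented requirement of this project.

```python
import sys

def compatible(version_info):
    """Return True if the interpreter is in the range assumed to have
    prebuilt wheels for the project's pinned spaCy/thinc releases
    (assumed range: Python 3.9-3.10)."""
    return (3, 9) <= tuple(version_info[:2]) <= (3, 10)

if not compatible(sys.version_info):
    # Recreating the venv with an older interpreter lets pip use
    # prebuilt wheels instead of compiling thinc's Cython sources.
    print("Consider creating the venv with Python 3.9 or 3.10.")
```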
