
ibrahimjelliti / deeplearning.ai-natural-language-processing-specialization

702 stars · 493 forks · 237.56 MB

This repository contains my full work and notes for Coursera's Natural Language Processing Specialization, taught by Younes Bensouda Mourri and Łukasz Kaiser and offered by deeplearning.ai.

Home Page: https://www.coursera.org/specializations/natural-language-processing

License: GNU General Public License v3.0

Jupyter Notebook 98.04% Python 1.46% TeX 0.01% Roff 0.50%
attention-mechanism coursera deep-learning deeplearning-ai encoder-decoder logistic-regression machine-learning naive-bayes neural neural-networks nlp probabilistic-models sequence-models specialization


deeplearning.ai-natural-language-processing-specialization's People

Contributors

farhan-najeeb, ibrahimjelliti


deeplearning.ai-natural-language-processing-specialization's Issues

C2_W2_Assignment: Part of Speech Tagging, accuracy does not match

The accuracy of the Viterbi algorithm is 0.9528 while the expected value is 0.9531. Although the difference is small (0.0003), the autograder marks the accuracy as incorrect.


output from the autograder:

Code Cell UNQ_C8: Function 'compute_accuracy' is incorrect. Check implementation.
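A discrepancy this small usually comes from how boundary tokens (e.g. empty lines between sentences) are counted. For reference, here is a minimal sketch of what such an accuracy check computes, using made-up tag lists rather than the assignment's actual data:

```python
# Hypothetical sketch of a POS-tagging accuracy check; the tag lists below
# are illustrative stand-ins, not the assignment's variables.
def compute_accuracy(predictions, labels):
    """Fraction of positions where the predicted tag equals the true tag."""
    num_correct = sum(p == t for p, t in zip(predictions, labels))
    return num_correct / len(labels)

# 3 of 4 tags match, so the accuracy is 0.75.
acc = compute_accuracy(["NN", "VB", "DT", "NN"], ["NN", "VB", "DT", "JJ"])
```

Skipping or double-counting even one token pair shifts the result by a few 1e-4, which is exactly the size of the mismatch reported above.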

Can you upload NLP_C3_W1_lecture_notebook_introduction_to_trax.ipynb and NLP_C3_W2_lecture_notebook_Hidden_State_Activation.ipynb?

Can you upload the lecture notebook?
NLP_C3_W1_lecture_notebook_introduction_to_trax.ipynb
NLP_C3_W2_lecture_notebook_Hidden_State_Activation.ipynb
NLP_C3_W2_lecture_notebook_Working_with_Jax_Numpy_and_Calculating_Perplexity.ipynb
NLP_C3_W2_lecture_notebook_Creating_a_GRU_model_Using_Trax.ipynb
NLP_C3_W3_lecture_notebook_Vanishing_Gradient.ipynb
NLP_C3_W4_lecture_notebook_Modified_Triplet_Loss.ipynb
NLP_C3_W4_lecture_notebook_Create_a_Siamese_Model_using_Trax.ipynb
NLP_C3_W4_lecture_notebook_Evaluate_a_Siamese_Model.ipynb

Do you know the reason?

In the Course 3 Week 2 assignment:

data_generator = data_generator(batch_size, v_sentences, v_labels, vocab[''], True)

but in the Course 3 Week 3 assignment:

data_generator = trax.supervised.inputs.add_loss_weights(
    data_generator(batch_size, v_sentences, v_labels, vocab[''], True),
    id_to_mask=vocab[''])

Why doesn't the Week 2 data_generator mask padding in the loss weights of the data? @ijelliti
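For context, Trax's add_loss_weights attaches a weights array that zeroes out positions whose label equals id_to_mask, so padded positions contribute nothing to the loss. A minimal NumPy sketch of that masking idea (the pad id 0 is a made-up stand-in, since the actual vocab key is elided in the snippet above):

```python
import numpy as np

# Sketch of loss-weight masking, assuming a hypothetical pad id of 0.
def add_loss_weights(labels, id_to_mask):
    # Weight 0 where the label is padding, 1 elsewhere, so padded
    # positions contribute nothing to the loss.
    return (labels != id_to_mask).astype(np.float32)

labels = np.array([[5, 7, 0, 0],
                   [3, 0, 0, 0]])
weights = add_loss_weights(labels, id_to_mask=0)
```

Without this step, the model is also trained to predict the pad token, which inflates the apparent accuracy on padded batches.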

confidence_ellipse is missing

Shouldn't there be a confidence_ellipse function in utils.py in the lab for Course 1, Week 2?

Otherwise the line "from utils import confidence_ellipse" fails with an import error.
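For reference, the course labs appear to adapt the matplotlib gallery's confidence-ellipse recipe. A sketch of its statistical core is below; the function name and default are illustrative, not necessarily the repo's actual helper:

```python
import numpy as np

# Sketch of the statistical core of a confidence_ellipse helper, similar
# in spirit to the matplotlib gallery example.
def ellipse_radii(x, y, n_std=2.0):
    """Return the (x, y) radii of a correlation ellipse for the data."""
    cov = np.cov(x, y)
    pearson = cov[0, 1] / np.sqrt(cov[0, 0] * cov[1, 1])
    # Clip to guard against floating-point drift outside [-1, 1].
    pearson = np.clip(pearson, -1.0, 1.0)
    # Radii of the unit ellipse used by the gallery recipe, scaled by n_std.
    radius_x = np.sqrt(1 + pearson)
    radius_y = np.sqrt(1 - pearson)
    return radius_x * n_std, radius_y * n_std
```

For perfectly correlated data the minor radius collapses to zero, which is a quick sanity check on any reimplementation.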

Chatbot using Reformer network in a different language

Is it possible to use this code to train a chatbot in a different language?
Or are you using a model that was pre-trained in English?

What changes are required to make it work in the German language?

Course 2 week 4 ~ Wrong back propagation function ~ UNQ_C4

In the second course of the specialization, in the Week 4 assignment, there is a mistake in the back-propagation calculation in section 'UNQ_C4'.

The gradients of the biases are calculated incorrectly. They should be changed as follows:

grad_b1 = (1/batch_size)*np.dot(l1,ones_array.T).reshape(-1,1)

grad_b2 = (1/batch_size)*np.dot(yhat-y,ones_array.T).reshape(-1,1)

where

ones_array = np.ones(batch_size)

This way it matches the slides in the course. You can also check your own notes in this GitHub repository for how the bias gradient should be calculated.
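The dot product with a ones vector is just a sum over the batch axis, which is what the bias gradient needs. A quick NumPy check with made-up shapes:

```python
import numpy as np

# Made-up shapes: 3 hidden units, batch of 4. `l1` stands in for the
# layer-1 error term from the assignment.
batch_size = 4
l1 = np.arange(12.0).reshape(3, batch_size)
ones_array = np.ones(batch_size)

# The proposed fix: dot with a ones vector, then reshape to a column.
grad_b1 = (1 / batch_size) * np.dot(l1, ones_array.T).reshape(-1, 1)

# Equivalent form: average over the batch axis directly.
expected = np.sum(l1, axis=1, keepdims=True) / batch_size
```

Both forms produce a column vector of shape (hidden_units, 1), matching the shape of the bias itself.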

add course notes of week 2 from Course 1

Following the Week 1 course notes, we continue adding notes for the subsequent weeks of each course.
This issue is for sharing notes on Naive Bayes from Course 1, Week 2.

add 'logistic_features.csv'

In 1 - Natural Language Processing with Classification and Vector Spaces - Labs - Week1 - NLP_C1_W1_lecture_nb_03.ipynb, the file 'logistic_features.csv' is missing.

utils.py not included

Hi! First, thank you very much for sharing your solutions.

I noticed that some notebooks call functions defined in a module called utils. Is that script available anywhere in the repo? It cannot be found when running the notebooks on a local machine.

Thanks again!

Missing .py file

Describe the bug
In NLP_C1_W4_lecture_nb_01 there is no utils_nb.py

Expected behavior
add the utils_nb.py

Improvement in Course 2/Lab/Week4/Assignment - Backprop function

The gradients for the biases as defined do not work:
grad_b1 = np.sum((1/batch_size)*np.dot(l1,x.T),axis=1,keepdims=True)
grad_b2 = np.sum((1/batch_size)*np.dot(yhat-y,h.T),axis=1,keepdims=True)

My suggestion to correct it:
grad_b1 = np.dot(l1,np.ones((batch_size,1)))/batch_size
grad_b2 = np.dot(yhat-y,np.ones((batch_size,1)))/batch_size

It computes the same thing, but it is easier to understand, and it works for me!

utils.py error

In C1_W2_Assignment, utils.py doesn't have the function lookup.
