- Week1
- 1. Welcome to the NLP Specialization.mp4
- 2. Welcome to Course 1.mp4
- 3. Supervised ML _ Sentiment Analysis.mp4
- 4. Vocabulary _ Feature Extraction.mp4
- 5. Negative and Positive Frequencies.mp4
- 6. Feature Extraction with Frequencies.mp4
- 7. Preprocessing.mp4
- 8. Putting it All Together.mp4
- 9. Logistic Regression Overview.mp4
- 10. Logistic Regression- Training.mp4
- 11. Logistic Regression- Testing.mp4
- 12. Logistic Regression- Cost Function.mp4
- Acknowledgement - Ken Church
- Acknowledgement - Ken Church _ Coursera.pdf
- Andrew Ng with Chris Manning.mp4
- Reading
- 1. Supervised ML _ Sentiment Analysis.pdf
- 2. Vocabulary _ Feature Extraction.pdf
- 3. Feature Extraction with Frequencies.pdf
- 4. Preprocessing.pdf
- 5. Putting it all together.pdf
- 6. Logistic Regression Overview.pdf
- 7. Logistic Regression Overview.pdf
- 8. Logistic Regression_ Training.pdf
- 9. Logistic Regression_ Testing.pdf
- 10. Optional Logistic Regression_ Cost Function.pdf
- 11. Optional Logistic Regression_ Gradient.pdf
- Week2
- 1. Probability and Bayes’ Rule.mp4
- 2. Bayes’ Rule.mp4
- 3. Naïve Bayes Introduction.mp4
- 4. Laplacian Smoothing.mp4
- 5. Log Likelihood, Part 1.mp4
- 6. Log Likelihood, Part 2.mp4
- 7. Training Naïve Bayes.mp4
- 8. Testing Naïve Bayes.mp4
- 9. Applications of Naïve Bayes.mp4
- 10. Naïve Bayes Assumptions.mp4
- 11. Error Analysis.mp4
- Reading
- Week3
- NLP_C1_W3
- Week4
- Week1
- C2_W1_Assignment
- NLP_C2_W1
- Week2
- 1. Part of Speech Tagging.mp4
- 2. Markov Chains.mp4
- 3. Markov Chains and POS Tags.mp4
- 4. Hidden Markov Models.mp4
- 5. Calculating Probabilities.mp4
- 6. Populating the Transition Matrix.mp4
- 7. Populating the Emission Matrix.mp4
- 8. The Viterbi Algorithm.mp4
- 9. Viterbi- Initialization.mp4
- 10. Viterbi- Forward Pass.mp4
- 11. Viterbi- Backward Pass.mp4
- NLP_C2_W2
- Week3
- C2_W3_Assignment
- NLP_C2_W3
- Week4
- 1. Overview.mp4
- 2. Basic Word Representations.mp4
- 3. Word Embeddings.mp4
- 4. How to Create Word Embeddings.mp4
- 5. Word Embedding Methods.mp4
- 6. Continuous Bag-of-Words Model.mp4
- 7. Cleaning and Tokenization.mp4
- 8. Sliding Window of Words in Python.mp4
- 9. Transforming Words into Vectors.mp4
- 10. Architecture of the CBOW Model.mp4
- 11. Architecture of the CBOW Model- Dimensions.mp4
- 12. Architecture of the CBOW Model- Dimensions 2.mp4
- 13. Architecture of the CBOW Model- Activation Functions.mp4
- 14. Training a CBOW Model- Cost Function.mp4
- 15. Training a CBOW Model- Cost Function.mp4
- 16. Training a CBOW Model- Backpropagation and Gradient Descent.mp4
- 17. Extracting Word Embedding Vectors.mp4
- 18. Evaluating Word Embeddings- Intrinsic Evaluation.mp4
- 19. Evaluating Word Embeddings- Extrinsic Evaluation.mp4
- 20. Conclusion.mp4
- tokenizers
- Week1
- NLP_C3
- Week2
- C3_W2
- C3_W2_Assignment
- model
- Week3
- C3_W3
- eval
- large
- small
- train
- model
- Reading
- Week4
- C3_W4
- C3_W4_Assignment
- model
- wordnet
- universal_tagset
- ar-padt.map
- bg-btb.map
- ca-cat3lb.map
- cs-pdt.map
- da-ddt.map
- de-negra.map
- de-tiger.map
- el-gdt.map
- en-brown.map
- en-ptb.map
- en-tweet.map
- en-tweet.README
- es-cast3lb.map
- es-eagles.map
- es-iula.map
- es-treetagger.map
- eu-eus3lb.map
- fi-tdt.map
- fr-paris.map
- hu-szeged.map
- it-isst.map
- iw-mila.map
- ja-kyoto.map
- ja-verbmobil.map
- ko-sejong.map
- nl-alpino.map
- pl-ipipan.map
- pt-bosque.map
- README
- ru-rnc.map
- sl-sdt.map
- sv-talbanken.map
- tu-metusbanci.map
- universal_tags.py
- zh-ctb6.map
- zh-sinica.map
- punkt
- PY3
- czech.pickle
- danish.pickle
- dutch.pickle
- english.pickle
- estonian.pickle
- finnish.pickle
- french.pickle
- german.pickle
- greek.pickle
- italian.pickle
- norwegian.pickle
- polish.pickle
- portuguese.pickle
- README
- russian.pickle
- slovene.pickle
- spanish.pickle
- swedish.pickle
- turkish.pickle
- punkt.zip
- questions.csv
- siamese.png
- 4. NLP with Attention Models
- Week1
- C4_W1
- data
- 0.1.0
- Reading
- Week2
- attention_lnb_figs
- C4_W2_L3_dot-product-attention_S01_introducing-attention_stripped.png
- C4_W2_L3_dot-product-attention_S01_introducing-attention.png
- C4_W2_L3_dot-product-attention_S02_queries-keys-and-values_stripped.png
- C4_W2_L3_dot-product-attention_S02_queries-keys-and-values.png
- C4_W2_L3_dot-product-attention_S03_concept-of-attention_stripped.png
- C4_W2_L3_dot-product-attention_S03_concept-of-attention.png
- C4_W2_L3_dot-product-attention_S04_attention-math_stripped.png
- C4_W2_L3_dot-product-attention_S04_attention-math.png
- C4_W2_L3_dot-product-attention_S05_attention-formula_stripped.png
- C4_W2_L3_dot-product-attention_S05_attention-formula.png
- C4_W2_L4_causal-attention_S01_three-ways-of-attention.png
- C4_W2_L4_causal-attention_S02_causal-attention_stripped.png
- C4_W2_L4_causal-attention_S02_causal-attention.png
- C4_W2_L4_causal-attention_S03_causal-attention-math_stripped.png
- C4_W2_L4_causal-attention_S03_causal-attention-math.png
- C4_W2_L4_causal-attention_S04_causal-attention-math-2_stripped.png
- C4_W2_L4_causal-attention_S04_causal-attention-math-2.png
- C4_W2_L5_multi-head-attention_S01_multi-head-attention_stripped.png
- C4_W2_L5_multi-head-attention_S01_multi-head-attention.png
- C4_W2_L5_multi-head-attention_S02_multi-head-attention-2.png
- C4_W2_L5_multi-head-attention_S03_multi-head-attention-3.png
- C4_W2_L5_multi-head-attention_S03_multi-head-attention-math_stripped.png
- C4_W2_L5_multi-head-attention_S04_multi-head-attention-overview_stripped.png
- C4_W2_L5_multi-head-attention_S04_multi-head-attention-overview.png
- C4_W2_L5_multi-head-attention_S05_multi-head-attention-concatenation_stripped.png
- C4_W2_L5_multi-head-attention_S05_multi-head-attention-concatenation.png
- C4_W2_L5_multi-head-attention_S06_multi-head-attention-scaled-dot-product_stripped.png
- C4_W2_L5_multi-head-attention_S06_multi-head-attention-scaled-dot-product.png
- C4_W2_L5_multi-head-attention_S07_multi-head-attention-formula_stripped.png
- C4_W2_L5_multi-head-attention_S07_multi-head-attention-formula.png
- SVG
- C4_W2_L3_dot-product-attention_S01_introducing-attention.svg
- C4_W2_L3_dot-product-attention_S02_queries-keys-and-values.svg
- C4_W2_L3_dot-product-attention_S03_concept-of-attention.svg
- C4_W2_L3_dot-product-attention_S04_attention-math.svg
- C4_W2_L3_dot-product-attention_S05_attention-formula.svg
- C4_W2_L4_causal-attention_S01_three-ways-of-attention.svg
- C4_W2_L4_causal-attention_S02_causal-attention.svg
- C4_W2_L4_causal-attention_S03_causal-attention-math.svg
- C4_W2_L4_causal-attention_S04_causal-attention-math-2.svg
- C4_W2_L5_multi-head-attention_S01_multi-head-attention.svg
- C4_W2_L5_multi-head-attention_S02_multi-head-attention-2.svg
- C4_W2_L5_multi-head-attention_S03_multi-head-attention-3.svg
- C4_W2_L5_multi-head-attention_S04_multi-head-attention-overview.svg
- C4_W2_L5_multi-head-attention_S05_multi-head-attention-concatenation.svg
- C4_W2_L5_multi-head-attention_S06_multi-head-attention-scaled-dot-product.svg
- C4_W2_L5_multi-head-attention_S07_multi-head-attention-formula.svg
- C4_W2_lecture_notebook_Attention.ipynb
- C4_W2_lecture_notebook_Transformer_Decoder.ipynb
- data
- 1.2.0
- transformer_decoder_lnb_figs
- C4_W2_Assignment
- 3.0.0
- cnn_dailymail-test.tfrecord-00000-of-00001
- cnn_dailymail-train.tfrecord-00005-of-00016
- cnn_dailymail-train.tfrecord-00007-of-00016
- cnn_dailymail-train.tfrecord-00008-of-00016
- cnn_dailymail-train.tfrecord-00014-of-00016
- cnn_dailymail-validation.tfrecord-00000-of-00001
- dataset_info.json
- decoder.png
- dotproduct.png
- masked-attention.png
- model.pkl.gz
- transformer_decoder_1.png
- transformer_decoder_zoomin.png
- transformer_decoder.png
- transformer.png
- transformerNews.png
- Untitled.ipynb
- vocab_dir
- Reading
- Week3
- 1. Week 3 Overview.mp4
- 2. Transfer Learning in NLP.mp4
- 3. ELMo, GPT, BERT, T5.mp4
- 4. Bidirectional Encoder Representations from Transformers (BERT).mp4
- 5. BERT Objective.mp4
- 6. Fine tuning BERT.mp4
- 7. Transformer- T5.mp4
- 8. Multi-Task Training Strategy.mp4
- 9. GLUE Benchmark.mp4
- 10. Question Answering.mp4
- C4_W3_Assignment
- 1.0.0
- C4W3
- Reading
- 1. Week 3 Overview.pdf
- 2. Transfer Learning in NLP.pdf
- 3. ELMo, GPT, BERT, T5.pdf
- 4. Bidirectional Encoder Representations from Transformers (BERT).pdf
- 5. BERT Objective.pdf
- 6. Fine tuning BERT.pdf
- 7. Transformer T5.pdf
- 8. Multi-Task Training Strategy.pdf
- 9. GLUE Benchmark.pdf
- 10. Question Answering.pdf
- 11. References.pdf
- Week4
- data
- vocabs
- C4W4
- branch1.PNG
- branch2.PNG
- branch3.PNG
- C4_W4_Ungraded_Lab_Reformer_LSH.ipynb
- C4_W4_Ungraded_Lab_Revnet.ipynb
- C4W4_LN2_image1.PNG
- C4W4_LN2_image2.PNG
- C4W4_LN2_image3.PNG
- C4W4_LN2_image4.PNG
- C4W4_LN2_image5.PNG
- C4W4_LN2_image6.PNG
- C4W4_LN2_image7.PNG
- C4W4_LN2_image8.PNG
- C4W4_LN2_image9.PNG
- C4W4_LN2_image10.PNG
- C4W4_LN2_image11.PNG
- C4W4_LN2_image12.PNG
- C4W4_LN2_image13.PNG
- Reversible2.PNG
- Revnet1.PNG
- Revnet2.PNG
- Revnet3.PNG
- Revnet4.PNG
- Revnet5.PNG
- Revnet6.PNG
- Revnet7.PNG
- Revnet8.PNG
- Reading
- 1. Tasks with Long Sequences.pdf
- 2. Optional AI Storytelling.pdf
- 3. Transformer Complexity.pdf
- 4. LSH Attention.pdf
- 5. Optional KNN _ LSH Review.pdf
- 6. Motivation for Reversible Layers_ Memory!.pdf
- 7. Reversible Residual Layers.pdf
- 8. Reformer.pdf
- 9. Optional Transformers beyond NLP.pdf
- 10. References.pdf
- BERT.pdf