  * [[https://www.youtube.com/watch?v=ZbHFLgBWgdQ&list=PLQiyVNMpDLKnZYBTUOlSI9mi9wAErFtFm&index=16|"Add-one" smoothing]]
  * [[https://www.youtube.com/watch?v=naNezonMA7k&list=PLQiyVNMpDLKnZYBTUOlSI9mi9wAErFtFm&index=17|Interpolation]]

  * Notebooks (a toy smoothing sketch follows below):
    * [[https://github.com/PLN-FaMAF/pln-uba-2019/blob/master/notebooks/01%20Procesamiento%20B%C3%A1sico%20de%20Texto.ipynb|01 Procesamiento Básico de Texto]]
    * [[https://github.com/PLN-FaMAF/pln-uba-2019/blob/master/notebooks/02%20Modelado%20de%20Lenguaje.ipynb|02 Modelado de Lenguaje]]
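A minimal, self-contained sketch of add-one ("Laplace") smoothing for a bigram model, using only the standard formula P(w | prev) = (count(prev, w) + 1) / (count(prev) + V). This is not the course's notebook code and the two-sentence corpus is invented; interpolation would additionally mix this bigram estimate with a unigram estimate via a weight λ.

<code python>
# Toy add-one (Laplace) smoothed bigram model.
from collections import defaultdict

corpus = [["the", "cat", "eats", "fish"],
          ["the", "dog", "eats", "meat"]]

bigram_counts = defaultdict(int)
context_counts = defaultdict(int)
vocab = set()

for sent in corpus:
    tokens = ["<s>"] + sent + ["</s>"]
    vocab.update(tokens)
    for prev, word in zip(tokens, tokens[1:]):
        bigram_counts[(prev, word)] += 1
        context_counts[prev] += 1

V = len(vocab)

def p_addone(word, prev):
    """Add-one smoothed bigram probability P(word | prev)."""
    return (bigram_counts[(prev, word)] + 1) / (context_counts[prev] + V)

print(p_addone("eats", "cat"))   # seen bigram
print(p_addone("meat", "cat"))   # unseen bigram, still gets probability > 0
</code>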
  * Supplementary material:


===== 2nd class =====


  * Sequence labeling (a toy Viterbi sketch follows the lists below):
    * [[https://www.youtube.com/watch?v=JhJU0Akkqzo&list=PLQiyVNMpDLKnZYBTUOlSI9mi9wAErFtFm&index=56|Introduction to Part-of-Speech (POS) tagging]]
    * [[https://www.youtube.com/watch?v=Zm_bmRhbaQg&list=PLQiyVNMpDLKnZYBTUOlSI9mi9wAErFtFm&index=57|Some methods and results]]
    * More on sequence labeling (Collins's course):
      * [[https://www.youtube.com/watch?v=6jUva-eD-xY&list=PLlQBy7xY8mbI13gwXZz4r55MeatSZOqm7&index=1|Sequence labeling]]
      * [[https://www.youtube.com/watch?v=VMZM7AYjEsg&list=PLlQBy7xY8mbI13gwXZz4r55MeatSZOqm7&index=2|Generative models for supervised learning]]
      * [[https://www.youtube.com/watch?v=cAJmM5k62yM&list=PLlQBy7xY8mbI13gwXZz4r55MeatSZOqm7&index=3|Introduction to Hidden Markov Models (HMMs)]]
      * [[https://www.youtube.com/watch?v=uAT3iJpQwJ0&list=PLlQBy7xY8mbI13gwXZz4r55MeatSZOqm7&index=4|Parameter estimation for HMMs]]
      * [[https://www.youtube.com/watch?v=ECu_KQV3V30&list=PLlQBy7xY8mbI13gwXZz4r55MeatSZOqm7&index=5|5. The Viterbi Algorithm for HMMs - Part I]]
      * [[https://www.youtube.com/watch?v=WqGUa54x8wE&list=PLlQBy7xY8mbI13gwXZz4r55MeatSZOqm7&index=6|6. The Viterbi Algorithm for HMMs - Part II]]
      * [[https://www.youtube.com/watch?v=Bu7oSlNCmdU&list=PLlQBy7xY8mbI13gwXZz4r55MeatSZOqm7&index=7|7. The Viterbi Algorithm for HMMs - Part III]]
      * [[https://www.youtube.com/watch?v=Y5hXE23Tdzc&list=PLlQBy7xY8mbI13gwXZz4r55MeatSZOqm7&index=8|8. Summary]]
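Below is a toy Viterbi decoder for a hand-made three-tag HMM, just to make the dynamic program from the lectures concrete. The tags, words and all probabilities are invented; a real tagger would estimate transition and emission probabilities from an annotated corpus.

<code python>
# Toy Viterbi decoding for a tiny hand-made HMM (all numbers are invented).
tags = ["D", "N", "V"]
start = {"D": 0.6, "N": 0.3, "V": 0.1}
trans = {"D": {"D": 0.1, "N": 0.8, "V": 0.1},
         "N": {"D": 0.1, "N": 0.3, "V": 0.6},
         "V": {"D": 0.5, "N": 0.4, "V": 0.1}}
emit = {"D": {"the": 0.9, "dog": 0.05, "barks": 0.05},
        "N": {"the": 0.05, "dog": 0.8, "barks": 0.15},
        "V": {"the": 0.05, "dog": 0.15, "barks": 0.8}}

def viterbi(words):
    # pi[i][t] = max probability of a tag sequence for words[:i+1] ending in tag t
    pi = [{t: start[t] * emit[t][words[0]] for t in tags}]
    back = [{}]
    for i in range(1, len(words)):
        pi.append({})
        back.append({})
        for t in tags:
            best_prev = max(tags, key=lambda s: pi[i - 1][s] * trans[s][t])
            pi[i][t] = pi[i - 1][best_prev] * trans[best_prev][t] * emit[t][words[i]]
            back[i][t] = best_prev
    # backtrack from the best final tag
    last = max(tags, key=lambda t: pi[-1][t])
    path = [last]
    for i in range(len(words) - 1, 0, -1):
        path.append(back[i][path[-1]])
    return list(reversed(path))

print(viterbi(["the", "dog", "barks"]))  # expected: ['D', 'N', 'V']
</code>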
  * Notebooks:
    * [[https://github.com/PLN-FaMAF/pln-uba-2019/blob/master/notebooks/tagging/01%20Etiquetado%20de%20Secuencias%20Parte%201.ipynb|01 Etiquetado de Secuencias Parte 1]]
    * [[https://github.com/PLN-FaMAF/pln-uba-2019/blob/master/notebooks/tagging/02%20Etiquetado%20de%20Secuencias%20Parte%202.ipynb|02 Etiquetado de Secuencias Parte 2]]
    * [[https://github.com/PLN-FaMAF/pln-uba-2019/blob/master/notebooks/tagging/03%20Etiquetado%20de%20Secuencias%20Parte%203.ipynb|03 Etiquetado de Secuencias Parte 3]]

{{:materias:pln:2019:errorsmeme.png?direct&400|}}
  * Text classification (a minimal Naive Bayes sketch follows this list):
    * [[https://www.youtube.com/watch?v=kxImnFg4ZiQ&list=PLQiyVNMpDLKnZYBTUOlSI9mi9wAErFtFm&index=24|What is text classification]]
    * [[https://www.youtube.com/watch?v=j39c7Gjx2gE&list=PLQiyVNMpDLKnZYBTUOlSI9mi9wAErFtFm&index=25|Naive Bayes]]
    * [[https://www.youtube.com/watch?v=VNEdufXVMaU&index=26&list=PLQiyVNMpDLKnZYBTUOlSI9mi9wAErFtFm|Formalizing the Naive Bayes classifier]]
    * [[https://www.youtube.com/watch?v=3jR8TZG8T88&index=27&list=PLQiyVNMpDLKnZYBTUOlSI9mi9wAErFtFm|Learning Naive Bayes]]
    * [[https://www.youtube.com/watch?v=LRFdF9J__Tc&index=28&list=PLQiyVNMpDLKnZYBTUOlSI9mi9wAErFtFm|Naive Bayes and its relation to language modeling]]
    * [[https://www.youtube.com/watch?v=OWGVQfuvNMk&list=PLQiyVNMpDLKnZYBTUOlSI9mi9wAErFtFm&index=29|Multinomial Naive Bayes: a complete example]]
    * [[https://www.youtube.com/watch?v=81j2nzzBHUw&list=PLQiyVNMpDLKnZYBTUOlSI9mi9wAErFtFm&index=30|Precision, recall and F1]]
    * [[https://www.youtube.com/watch?v=TdkWIxGoiak&list=PLQiyVNMpDLKnZYBTUOlSI9mi9wAErFtFm&index=31|Evaluation of text classification]]
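As a companion to the videos, here is a minimal multinomial Naive Bayes classifier with add-one smoothing. The two-class toy training set is invented and there is no real evaluation step; in practice you would also measure precision, recall and F1 on held-out data.

<code python>
# Minimal multinomial Naive Bayes with add-one smoothing, trained on an
# invented two-class toy corpus (just to make the videos' formulas concrete).
import math
from collections import Counter, defaultdict

train = [("sports", "the team won the match".split()),
         ("sports", "great match for the local team".split()),
         ("economy", "the dollar went up again".split()),
         ("economy", "inflation went down this month".split())]

class_counts = Counter(label for label, _ in train)
word_counts = defaultdict(Counter)
vocab = set()
for label, words in train:
    word_counts[label].update(words)
    vocab.update(words)
V = len(vocab)

def predict(words):
    scores = {}
    for label in class_counts:
        total = sum(word_counts[label].values())
        # log prior + sum of add-one smoothed log likelihoods
        score = math.log(class_counts[label] / len(train))
        for w in words:
            score += math.log((word_counts[label][w] + 1) / (total + V))
        scores[label] = score
    return max(scores, key=scores.get)

print(predict("the team played a great match".split()))  # expected: 'sports'
</code>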
  * Sentiment Analysis (a tiny lexicon-based baseline sketch follows this list):
    * [[https://www.youtube.com/watch?v=vy0HC5H-484&list=PLQiyVNMpDLKnZYBTUOlSI9mi9wAErFtFm&index=33|What is Sentiment Analysis]]
    * [[https://www.youtube.com/watch?v=Dgqt62RQMaY&list=PLQiyVNMpDLKnZYBTUOlSI9mi9wAErFtFm&index=34|Sentiment Analysis: A baseline algorithm]]
    * [[https://www.youtube.com/watch?v=wBE0FE_2ddE&list=PLQiyVNMpDLKnZYBTUOlSI9mi9wAErFtFm&index=35|Sentiment Lexicons]]
    * [[https://www.youtube.com/watch?v=Z7RxBcpyN1U&list=PLQiyVNMpDLKnZYBTUOlSI9mi9wAErFtFm&index=36|Learning Sentiment Lexicons]]
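A tiny sketch of the word-counting baseline discussed in the videos: classify a text by counting positive and negative words. The two word lists here are invented stand-ins for a real sentiment lexicon.

<code python>
# Lexicon-based sentiment baseline: count positive vs. negative words.
# The word lists below are toy examples, not a real lexicon.
POSITIVE = {"good", "great", "excellent", "love", "nice"}
NEGATIVE = {"bad", "terrible", "awful", "hate", "boring"}

def sentiment(text):
    tokens = text.lower().split()
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    if score > 0:
        return "pos"
    if score < 0:
        return "neg"
    return "neutral"

print(sentiment("a great movie and I love the acting"))       # 'pos'
print(sentiment("it was boring and the plot was terrible"))   # 'neg'
</code>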
===== 3rd class =====


  * Strategies for Machine Learning:
    * [[https://docs.google.com/presentation/d/e/2PACX-1vSjH0TlJzJpY3JeWWY_vPQpHUQnOzcg9cEMLzAcXj8cnm00l8G9_2a9L8eyB6aDWlpUgS0dOTE88j4y/pub?start=false&loop=false&delayms=3000|Part 1]]
    * [[https://docs.google.com/presentation/d/e/2PACX-1vQkXr831rL-O4iqzsWN1a7vqGoXSww-5qfdHomFu5AF5_hWC9QBo984Il92jbQ3LLvQmF73Pksib13m/pub?start=false&loop=false&delayms=3000|Part 2]]
  * Notebooks (a bag-of-words classifier sketch follows this list):
    * [[https://github.com/PLN-FaMAF/pln-uba-2019/blob/master/notebooks/sentiment/01%20Baseline.ipynb|01 Baseline]]
    * [[https://github.com/PLN-FaMAF/pln-uba-2019/blob/master/notebooks/sentiment/02%20Bag%20of%20Words.ipynb|02 Bag of Words]]
    * [[https://github.com/PLN-FaMAF/pln-uba-2019/blob/master/notebooks/sentiment/03%20Clasificador%20Basico.ipynb|03 Clasificador Basico]]
    * [[https://github.com/PLN-FaMAF/pln-uba-2019/blob/master/notebooks/sentiment/04%20Modelos%20de%20Clasificacion.ipynb|04 Modelos de Clasificacion]]
    * [[https://github.com/PLN-FaMAF/pln-uba-2019/tree/master/notebooks/sentiment|More...]]
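In the spirit of the notebooks above (but not their code), a bag-of-words sentiment classifier can be assembled with a scikit-learn pipeline; the four training documents below are invented.

<code python>
# Bag-of-words text classifier with scikit-learn (toy data, for illustration).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

X_train = ["I loved this movie, great acting",
           "wonderful film, I enjoyed it a lot",
           "terrible plot, I hated every minute",
           "boring, a complete waste of time"]
y_train = ["pos", "pos", "neg", "neg"]

pipeline = Pipeline([
    ("vect", CountVectorizer()),    # tokenize and build word-count features
    ("clf", LogisticRegression()),  # linear classifier on top of the counts
])
pipeline.fit(X_train, y_train)

print(pipeline.predict(["I enjoyed this great film"]))  # expected: ['pos']
</code>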
  * Talk by Rodrigo Loredo as part of the LIIA seminar.

  * Supplementary material:
    * [[https://www.deeplearning.ai/machine-learning-yearning/|Andrew Ng. “Machine Learning Yearning”. Draft, 2018.]]
    * [[https://karpathy.github.io/2019/04/25/recipe/|A Recipe for Training Neural Networks (Andrej Karpathy)]]
    * [[https://github.com/EpistasisLab/tpot|TPOT: A Python Automated Machine Learning tool that optimizes machine learning pipelines using genetic programming.]]
===== 4th class =====

/*
  * [[http://web.stanford.edu/class/cs224n/lectures/lecture1.pdf|Introduction to NLP and Deep Learning]] ([[https://cs.famaf.unc.edu.ar/~francolq/uba2018/lecture1-2.pdf|short version]], [[https://www.youtube.com/watch?v=OQQ-W_63UgQ&t=2725s&index=1&list=PL3FW7Lu3i5Jsnh1rnUwq_TcylNr7EkRe6|videolecture]])
*/

  * Word Vectors (a cosine-similarity sketch follows this list):
    * [[http://web.stanford.edu/class/cs224n/slides/cs224n-2019-lecture01-wordvecs1.pdf|Word Vectors 1]] ([[https://youtu.be/8rXD5-xhemo|videolecture]])
    * [[http://web.stanford.edu/class/cs224n/slides/cs224n-2019-lecture02-wordvecs2.pdf|Word Vectors 2]] ([[https://youtu.be/kEMJRjEdNzM|videolecture]])
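A small illustration of how word vectors are used once you have them: cosine similarity and nearest neighbours. The 3-dimensional vectors below are invented; real embeddings (word2vec, GloVe) are learned from large corpora and have hundreds of dimensions.

<code python>
# Cosine similarity between word vectors (tiny hand-made vectors, invented).
import numpy as np

vectors = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.1]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def most_similar(word):
    others = [w for w in vectors if w != word]
    return max(others, key=lambda w: cosine(vectors[word], vectors[w]))

print(cosine(vectors["king"], vectors["queen"]))  # close to 1
print(most_similar("king"))                       # 'queen'
</code>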
  * Supplementary material:
    * [[https://docs.google.com/document/d/18NoNdArdzDLJFQGBMVMsQ-iLOowP1XXDaSVRmYN0IyM/edit|Frontiers in Natural Language Processing Expert Responses:]] a survey of leading researchers in the field. In particular:
      * What would you say is the most influential work in NLP in the last decade, if you had to pick just one?


===== 5th class =====
  * Introduction to Neural Networks (a tiny backpropagation sketch follows this list):
    * [[https://www.youtube.com/watch?v=8CWyBNX6eDo&list=PLoROMvodv4rOhcuXMZkNm7j3fVwBBY42z&index=3|Neural Networks (cs224n lecture 3)]]
    * [[https://www.youtube.com/watch?v=yLYHDSv-288&list=PLoROMvodv4rOhcuXMZkNm7j3fVwBBY42z&index=4|Backpropagation (cs224n lecture 4)]]
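To make the lectures concrete, here is a minimal feed-forward network trained with hand-written backpropagation on the XOR toy problem. It is a sketch of the mechanics only (sigmoid units, cross-entropy loss, full-batch gradient descent), not a realistic NLP model.

<code python>
# A tiny feed-forward network with manually coded backpropagation (XOR toy task).
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)   # input -> hidden
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)   # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for step in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)      # hidden activations, shape (4, 8)
    out = sigmoid(h @ W2 + b2)    # predicted probabilities, shape (4, 1)
    # backward pass: gradients of binary cross-entropy w.r.t. pre-activations
    d_out = out - y
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(out.round(2).ravel())  # should approach [0, 1, 1, 0]
</code>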
  * Syntactic parsing:
    * [[https://youtu.be/nC9_RfjYwqA|Linguistic Structure: Dependency Parsing (cs224n lecture 5)]]

  * Recurrent Neural Networks (a minimal RNN step sketch follows below):
    * [[https://www.youtube.com/watch?v=iWea12EAu6U&list=PLoROMvodv4rOhcuXMZkNm7j3fVwBBY42z&index=6|Language Models and RNNs (cs224n lecture 6)]]
    * [[https://www.youtube.com/watch?v=QEw0qEa0E50&list=PLoROMvodv4rOhcuXMZkNm7j3fVwBBY42z&index=7|Vanishing Gradients, Fancy RNNs (cs224n lecture 7)]]

  * Links:
    * [[http://colah.github.io/posts/2015-08-Understanding-LSTMs/|Understanding LSTM Networks]]
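A minimal sketch of the recurrence that RNN language models are built on: h_t = tanh(W_xh x_t + W_hh h_(t-1) + b). The weights below are random and untrained; training, and the gating that LSTMs add, are what the lectures and the post above cover.

<code python>
# Forward pass of a vanilla (Elman) RNN over a sequence of one-hot inputs.
# Weights are random: this only shows how the hidden state carries context.
import numpy as np

rng = np.random.default_rng(0)
vocab = ["a", "b", "c"]
hidden_size = 4

W_xh = rng.normal(scale=0.1, size=(hidden_size, len(vocab)))
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
b = np.zeros(hidden_size)

def one_hot(symbol):
    x = np.zeros(len(vocab))
    x[vocab.index(symbol)] = 1.0
    return x

h = np.zeros(hidden_size)
for symbol in ["a", "b", "b", "c"]:
    h = np.tanh(W_xh @ one_hot(symbol) + W_hh @ h + b)
    print(symbol, h.round(3))  # hidden state depends on all symbols seen so far
</code>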
===== 6th class =====

{{:materias:pln:2019:deepmeme.png?direct&400|}}

  * Machine Translation and "sequence to sequence" models (a dot-product attention sketch follows below):
    * [[https://www.youtube.com/watch?v=XXtpJxZBa2c&list=PLoROMvodv4rOhcuXMZkNm7j3fVwBBY42z&index=8|Translation, Seq2Seq, Attention (cs224n lecture 8)]]
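A small numeric sketch of (scaled) dot-product attention, the mechanism behind the seq2seq models in the lecture: the decoder state scores each encoder state, the scores are softmax-normalized, and the encoder states are averaged with those weights. All vectors here are random toy data, not a full translation model.

<code python>
# Scaled dot-product attention over a toy "encoder" of three state vectors.
import numpy as np

rng = np.random.default_rng(0)
d = 4
encoder_states = rng.normal(size=(3, d))  # one vector per source position
query = rng.normal(size=d)                # e.g. the current decoder state

scores = encoder_states @ query / np.sqrt(d)
weights = np.exp(scores - scores.max())
weights /= weights.sum()                  # softmax over source positions
context = weights @ encoder_states        # attention-weighted summary

print(weights.round(3))  # how much the decoder "attends" to each source position
print(context.round(3))
</code>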
  * [[https://www.youtube.com/watch?v=yIdF-17HwSk&list=PLoROMvodv4rOhcuXMZkNm7j3fVwBBY42z&index=11|Question Answering (cs224n lecture 10)]]

  * [[https://www.youtube.com/watch?v=S-CspeZ8FHc&list=PLoROMvodv4rOhcuXMZkNm7j3fVwBBY42z&index=14|Contextual Word Embeddings (cs224n lecture 13)]]

  * Supplementary material:
    * [[http://ruder.io/deep-learning-nlp-best-practices/index.html|Deep Learning for NLP Best Practices]] (Sebastian Ruder)
    * [[http://nlp.seas.harvard.edu/2018/04/03/attention.html|The Annotated Transformer]] (Alexander Rush)
    * [[https://talktotransformer.com/]]
    * [[http://ruder.io/4-biggest-open-problems-in-nlp/|The 4 Biggest Open Problems in NLP]] (Sebastian Ruder)