  * [[https://www.youtube.com/watch?v=Z7RxBcpyN1U&list=PLQiyVNMpDLKnZYBTUOlSI9mi9wAErFtFm&index=36|Learning Sentiment Lexicons]]


===== Class 3 =====

  * Strategies for Machine Learning:
    * [[https://docs.google.com/presentation/d/e/2PACX-1vSjH0TlJzJpY3JeWWY_vPQpHUQnOzcg9cEMLzAcXj8cnm00l8G9_2a9L8eyB6aDWlpUgS0dOTE88j4y/pub?start=false&loop=false&delayms=3000|Part 1]]
    * [[https://docs.google.com/presentation/d/e/2PACX-1vQkXr831rL-O4iqzsWN1a7vqGoXSww-5qfdHomFu5AF5_hWC9QBo984Il92jbQ3LLvQmF73Pksib13m/pub?start=false&loop=false&delayms=3000|Part 2]]

  * Notebooks:
    * [[https://github.com/PLN-FaMAF/pln-uba-2019/blob/master/notebooks/sentiment/01%20Baseline.ipynb|01 Baseline]]
    * [[https://github.com/PLN-FaMAF/pln-uba-2019/blob/master/notebooks/sentiment/02%20Bag%20of%20Words.ipynb|02 Bag of Words]]
    * [[https://github.com/PLN-FaMAF/pln-uba-2019/blob/master/notebooks/sentiment/03%20Clasificador%20Basico.ipynb|03 Clasificador Basico]]
    * [[https://github.com/PLN-FaMAF/pln-uba-2019/blob/master/notebooks/sentiment/04%20Modelos%20de%20Clasificacion.ipynb|04 Modelos de Clasificacion]]
    * [[https://github.com/PLN-FaMAF/pln-uba-2019/tree/master/notebooks/sentiment|More...]]

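As a pointer to what the notebooks above cover, here is a minimal sketch of the bag-of-words representation on a toy sentiment corpus. This is illustrative pure Python, not the actual notebook code, and the "classifier" (averaging per-word label scores) is a deliberately simple stand-in for the models discussed in class.

```python
# Minimal bag-of-words text classifier on a toy sentiment corpus.
# Illustrative only: the corpus, labels and scoring rule are made up.

def build_vocab(docs):
    """Map each word seen in the corpus to a feature index."""
    vocab = {}
    for doc in docs:
        for word in doc.lower().split():
            vocab.setdefault(word, len(vocab))
    return vocab

def vectorize(doc, vocab):
    """Count occurrences of each vocabulary word in the document."""
    vec = [0] * len(vocab)
    for word in doc.lower().split():
        if word in vocab:
            vec[vocab[word]] += 1
    return vec

# Toy corpus: 1 = positive, 0 = negative.
train_docs = ["good great fun", "great plot good acting",
              "bad boring", "boring awful bad"]
train_labels = [1, 1, 0, 0]

vocab = build_vocab(train_docs)

# Score each word by the average label of the documents it appears in.
word_score = [0.0] * len(vocab)
word_count = [0] * len(vocab)
for doc, label in zip(train_docs, train_labels):
    for i, c in enumerate(vectorize(doc, vocab)):
        if c:
            word_score[i] += label * c
            word_count[i] += c
scores = [s / c if c else 0.5 for s, c in zip(word_score, word_count)]

def predict(doc):
    """Classify by the average score of the known words in the document."""
    vec = vectorize(doc, vocab)
    total = sum(vec)
    if total == 0:
        return 1  # arbitrary default when no word is in the vocabulary
    avg = sum(v * s for v, s in zip(vec, scores)) / total
    return 1 if avg >= 0.5 else 0

print(predict("a good fun movie"))  # known words lean positive
print(predict("boring and bad"))    # known words lean negative
```

Unknown words ("a", "movie", "and") are simply ignored, which is exactly the limitation the richer models in the later notebooks try to address.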
  * Talk by Rodrigo Loredo as part of the LIIA seminar.

  * Supplementary material:
    * [[https://www.deeplearning.ai/machine-learning-yearning/|Andrew Ng. “Machine Learning Yearning”. Draft, 2018.]]
    * [[https://karpathy.github.io/2019/04/25/recipe/|A Recipe for Training Neural Networks (Andrej Karpathy)]]
    * [[https://github.com/EpistasisLab/tpot|TPOT: A Python Automated Machine Learning tool that optimizes machine learning pipelines using genetic programming.]]

===== Class 4 =====

/*
  * [[http://web.stanford.edu/class/cs224n/lectures/lecture1.pdf|Introduction to NLP and Deep Learning]] ([[https://cs.famaf.unc.edu.ar/~francolq/uba2018/lecture1-2.pdf|short version]], [[https://www.youtube.com/watch?v=OQQ-W_63UgQ&t=2725s&index=1&list=PL3FW7Lu3i5Jsnh1rnUwq_TcylNr7EkRe6|videolecture]])
*/

  * Word Vectors:
    * [[http://web.stanford.edu/class/cs224n/slides/cs224n-2019-lecture01-wordvecs1.pdf|Word Vectors 1]] ([[https://youtu.be/8rXD5-xhemo|videolecture]])
    * [[http://web.stanford.edu/class/cs224n/slides/cs224n-2019-lecture02-wordvecs2.pdf|Word Vectors 2]] ([[https://youtu.be/kEMJRjEdNzM|videolecture]])
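The geometric intuition behind these lectures (similar words point in similar directions; relations show up as vector offsets) can be sketched with cosine similarity. The 3-dimensional vectors below are hand-made for illustration; real word2vec/GloVe embeddings are learned from corpora and have hundreds of dimensions.

```python
# Cosine similarity over toy, hand-set "word vectors".
# Real embeddings are trained; these values are made up for the example.
import math

vectors = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.1, 0.8],
    "man":   [0.5, 0.9, 0.0],
    "woman": [0.5, 0.1, 0.9],
    "apple": [0.0, 0.1, 0.1],
}

def cosine(u, v):
    """Cosine of the angle between two vectors (1.0 = same direction)."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def nearest(vec, exclude=()):
    """Word whose vector is most cosine-similar to `vec`."""
    return max((w for w in vectors if w not in exclude),
               key=lambda w: cosine(vec, vectors[w]))

# The classic analogy: king - man + woman should land near queen.
analogy = [k - m + w for k, m, w in
           zip(vectors["king"], vectors["man"], vectors["woman"])]
print(nearest(analogy, exclude={"king", "man", "woman"}))  # queen
```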

  * Supplementary material:
    * [[https://docs.google.com/document/d/18NoNdArdzDLJFQGBMVMsQ-iLOowP1XXDaSVRmYN0IyM/edit|Frontiers in Natural Language Processing Expert Responses:]] a survey of leading researchers in the field. In particular:
      * What would you say is the most influential work in NLP in the last decade, if you had to pick just one?


===== Class 5 =====

  * Introduction to Neural Networks:
    * [[https://www.youtube.com/watch?v=8CWyBNX6eDo&list=PLoROMvodv4rOhcuXMZkNm7j3fVwBBY42z&index=3|Neural Networks (cs224n lecture 3)]]
    * [[https://www.youtube.com/watch?v=yLYHDSv-288&list=PLoROMvodv4rOhcuXMZkNm7j3fVwBBY42z&index=4|Backpropagation (cs224n lecture 4)]]
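The core of the backpropagation lecture is the layer-by-layer chain rule, which can be checked numerically. Below is a one-hidden-layer network with a manual backward pass and a finite-difference gradient check; all shapes and data are toy choices, not anything from the lecture's assignments.

```python
# One-hidden-layer network with manual backpropagation, plus a
# finite-difference check of one gradient entry (toy data throughout).
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=3)        # input vector
y = 1.0                       # scalar target
W1 = rng.normal(size=(4, 3))  # hidden-layer weights
w2 = rng.normal(size=4)       # output-layer weights

def loss(W1, w2):
    h = np.tanh(W1 @ x)       # hidden activations
    pred = w2 @ h             # scalar prediction
    return 0.5 * (pred - y) ** 2

# Backward pass (chain rule, one layer at a time):
h = np.tanh(W1 @ x)
pred = w2 @ h
dpred = pred - y                      # dL/dpred
dw2 = dpred * h                       # dL/dw2
dh = dpred * w2                       # dL/dh
dW1 = np.outer(dh * (1 - h ** 2), x)  # dL/dW1, through tanh'(z) = 1 - tanh(z)^2

# Numerical check: perturb one weight and compare slopes.
eps = 1e-6
W1p = W1.copy()
W1p[0, 0] += eps
numeric = (loss(W1p, w2) - loss(W1, w2)) / eps
print(abs(numeric - dW1[0, 0]) < 1e-4)  # analytic gradient matches numeric
```

This finite-difference check is the standard way to debug a hand-written backward pass before trusting it on real data.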

  * Syntactic parsing:
    * [[https://youtu.be/nC9_RfjYwqA|Linguistic Structure: Dependency Parsing (cs224n lecture 5)]]

  * Recurrent Neural Networks:
    * [[https://www.youtube.com/watch?v=iWea12EAu6U&list=PLoROMvodv4rOhcuXMZkNm7j3fVwBBY42z&index=6|Language Models and RNNs (cs224n lecture 6)]]
    * [[https://www.youtube.com/watch?v=QEw0qEa0E50&list=PLoROMvodv4rOhcuXMZkNm7j3fVwBBY42z&index=7|Vanishing Gradients, Fancy RNNs (cs224n lecture 7)]]

  * Links:
    * [[http://colah.github.io/posts/2015-08-Understanding-LSTMs/|Understanding LSTM Networks]]

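The gate diagram in "Understanding LSTM Networks" translates almost line-for-line into code. Here is one forward step of an LSTM cell as a sketch: weights are random toy values, and real implementations fuse the four gate multiplications into a single matrix product for speed.

```python
# One forward step of an LSTM cell, following the usual gate equations
# (forget f, input i, output o, candidate g). Toy random weights.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """W, U, b are dicts holding the parameters of gates f, i, o, g."""
    f = sigmoid(W["f"] @ x + U["f"] @ h_prev + b["f"])  # forget gate
    i = sigmoid(W["i"] @ x + U["i"] @ h_prev + b["i"])  # input gate
    o = sigmoid(W["o"] @ x + U["o"] @ h_prev + b["o"])  # output gate
    g = np.tanh(W["g"] @ x + U["g"] @ h_prev + b["g"])  # candidate cell
    c = f * c_prev + i * g   # cell state: keep part of the old, add new
    h = o * np.tanh(c)       # hidden state: gated view of the cell
    return h, c

rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
W = {k: rng.normal(size=(n_hid, n_in)) for k in "fiog"}
U = {k: rng.normal(size=(n_hid, n_hid)) for k in "fiog"}
b = {k: np.zeros(n_hid) for k in "fiog"}

h, c = np.zeros(n_hid), np.zeros(n_hid)
for x in rng.normal(size=(5, n_in)):   # run 5 time steps
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape, bool(np.all(np.abs(h) <= 1.0)))  # hidden state bounded by tanh
```

The additive update `c = f * c_prev + i * g` is what mitigates the vanishing-gradient problem discussed in lecture 7.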

===== Class 6 =====

{{:materias:pln:2019:deepmeme.png?direct&400|}}

  * Machine Translation and sequence-to-sequence models:
    * [[https://www.youtube.com/watch?v=XXtpJxZBa2c&list=PLoROMvodv4rOhcuXMZkNm7j3fVwBBY42z&index=8|Translation, Seq2Seq, Attention (cs224n lecture 8)]]

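The attention mechanism from this lecture reduces to a few lines: the decoder state scores each encoder state, a softmax turns the scores into weights, and the context vector is the weighted sum. The dot-product scoring and toy dimensions below are one simple variant, chosen for illustration.

```python
# Dot-product attention over toy encoder states.
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

def attention(decoder_state, encoder_states):
    scores = encoder_states @ decoder_state  # one score per source position
    weights = softmax(scores)                # attention distribution
    context = weights @ encoder_states       # weighted sum of encoder states
    return context, weights

rng = np.random.default_rng(0)
enc = rng.normal(size=(6, 8))  # 6 source positions, hidden size 8
dec = rng.normal(size=8)       # current decoder hidden state

context, weights = attention(dec, enc)
print(context.shape)   # context has the encoder hidden size: (8,)
print(weights.shape)   # one weight per source position: (6,)
```

Because the weights sum to 1, they can be read as "how much the decoder looks at each source word", which is the interpretability argument made in the lecture.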

  * [[https://www.youtube.com/watch?v=yIdF-17HwSk&list=PLoROMvodv4rOhcuXMZkNm7j3fVwBBY42z&index=11|Question Answering (cs224n lecture 10)]]

  * [[https://www.youtube.com/watch?v=S-CspeZ8FHc&list=PLoROMvodv4rOhcuXMZkNm7j3fVwBBY42z&index=14|Contextual Word Embeddings (cs224n lecture 13)]]

  * Supplementary material:
    * [[http://ruder.io/deep-learning-nlp-best-practices/index.html|Deep Learning for NLP Best Practices]] (Sebastian Ruder)
    * [[http://nlp.seas.harvard.edu/2018/04/03/attention.html|The Annotated Transformer]] (Alexander Rush)
    * [[https://talktotransformer.com/]]
    * [[http://ruder.io/4-biggest-open-problems-in-nlp/|The 4 Biggest Open Problems in NLP]] (Sebastian Ruder)