materias:pln:uba2019:teoricos — revision 2019/09/23 18:09 by francolq
  * [[http://web.stanford.edu/class/cs224n/slides/cs224n-2019-lecture02-wordvecs2.pdf|Word Vectors 2]] ([[https://youtu.be/kEMJRjEdNzM|videolecture]])

  * Supplementary material:
    * [[https://docs.google.com/document/d/18NoNdArdzDLJFQGBMVMsQ-iLOowP1XXDaSVRmYN0IyM/edit|Frontiers in Natural Language Processing Expert Responses:]] a survey of leading researchers in the field. In particular:
      * What would you say is the most influential work in NLP in the last decade, if you had to pick just one?
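The word-vector lectures above center on similarity in embedding space. As a minimal sketch of the core operation, cosine similarity over dense vectors (the toy 3-d "embeddings" below are made-up values, purely for illustration):

```python
import math

def cosine(u, v):
    # Cosine similarity between two word vectors:
    # cos(u, v) = (u . v) / (|u| |v|)
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Hypothetical 3-d embeddings, for illustration only.
vec = {
    "king":  [0.8, 0.6, 0.1],
    "queen": [0.7, 0.7, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

# Related words should score higher than unrelated ones.
print(cosine(vec["king"], vec["queen"]) > cosine(vec["king"], vec["apple"]))  # True
```

Real embeddings have hundreds of dimensions and are learned from corpora (e.g. word2vec, GloVe), but the similarity computation is exactly this.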
===== 5th class =====

  * Introduction to Neural Networks:
    * [[https://www.youtube.com/watch?v=8CWyBNX6eDo&list=PLoROMvodv4rOhcuXMZkNm7j3fVwBBY42z&index=3|Neural Networks (cs224n lecture 3)]]
    * [[https://www.youtube.com/watch?v=yLYHDSv-288&list=PLoROMvodv4rOhcuXMZkNm7j3fVwBBY42z&index=4|Backpropagation (cs224n lecture 4)]]
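The backpropagation lecture works the chain rule by hand; a minimal sketch for a single sigmoid unit with squared-error loss, with the analytic gradient checked against a finite-difference approximation (all parameter values here are arbitrary):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def loss(w, b, x, t):
    # Forward pass: one sigmoid unit, squared error against target t.
    y = sigmoid(w * x + b)
    return (y - t) ** 2

def grads(w, b, x, t):
    # Backward pass via the chain rule:
    # dL/dw = dL/dy * dy/dz * dz/dw, and likewise for b.
    y = sigmoid(w * x + b)
    dL_dy = 2.0 * (y - t)
    dy_dz = y * (1.0 - y)       # derivative of the sigmoid
    dL_dz = dL_dy * dy_dz
    return dL_dz * x, dL_dz     # (dL/dw, dL/db)

# Arbitrary values; check the analytic gradient numerically.
w, b, x, t = 0.5, -0.2, 1.5, 1.0
gw, gb = grads(w, b, x, t)
eps = 1e-6
num_gw = (loss(w + eps, b, x, t) - loss(w - eps, b, x, t)) / (2 * eps)
print(abs(gw - num_gw) < 1e-7)  # True: analytic and numeric gradients agree
```

The same gradient-checking trick scales to full networks and is a standard way to debug a hand-written backward pass.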

  * Syntactic parsing:
    * [[https://youtu.be/nC9_RfjYwqA|Linguistic Structure: Dependency Parsing (cs224n lecture 5)]]

  * Recurrent Neural Networks:
    * [[https://www.youtube.com/watch?v=iWea12EAu6U&list=PLoROMvodv4rOhcuXMZkNm7j3fVwBBY42z&index=6|Language Models and RNNs (cs224n lecture 6)]]
    * [[https://www.youtube.com/watch?v=QEw0qEa0E50&list=PLoROMvodv4rOhcuXMZkNm7j3fVwBBY42z&index=7|Vanishing Gradients, Fancy RNNs (cs224n lecture 7)]]

  * Links:
    * [[http://colah.github.io/posts/2015-08-Understanding-LSTMs/|Understanding LSTM Networks]]
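The LSTM post above walks through the gate equations; a minimal sketch of one LSTM step with scalar state (real implementations are vectorized, and the parameter values below are arbitrary placeholders) could look like:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h_prev, c_prev, p):
    # Standard LSTM gate equations, with scalar weights for clarity.
    f = sigmoid(p["wf"] * x + p["uf"] * h_prev + p["bf"])    # forget gate
    i = sigmoid(p["wi"] * x + p["ui"] * h_prev + p["bi"])    # input gate
    o = sigmoid(p["wo"] * x + p["uo"] * h_prev + p["bo"])    # output gate
    g = math.tanh(p["wg"] * x + p["ug"] * h_prev + p["bg"])  # candidate
    c = f * c_prev + i * g       # cell state: forget old, add new
    h = o * math.tanh(c)         # hidden state exposed to the next layer
    return h, c

# Arbitrary placeholder parameters.
params = {k: 0.1 for k in
          ["wf", "uf", "bf", "wi", "ui", "bi", "wo", "uo", "bo", "wg", "ug", "bg"]}

h, c = 0.0, 0.0
for x in [1.0, -0.5, 2.0]:   # a toy input sequence
    h, c = lstm_step(x, h, c, params)
print(-1.0 < h < 1.0)  # True: h is bounded by the tanh/sigmoid gates
```

The additive update of `c` (rather than repeated multiplication) is what mitigates the vanishing-gradient problem discussed in lecture 7.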

===== 6th class =====

{{:materias:pln:2019:deepmeme.png?direct&400|}}

  * Machine Translation and sequence-to-sequence models:
    * [[https://www.youtube.com/watch?v=XXtpJxZBa2c&list=PLoROMvodv4rOhcuXMZkNm7j3fVwBBY42z&index=8|Translation, Seq2Seq, Attention (cs224n lecture 8)]]
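The seq2seq lecture introduces attention as a softmax-weighted sum of encoder states. A minimal dot-product attention sketch over toy vectors (all values below are illustrative; keys and values coincide, as in basic seq2seq attention):

```python
import math

def softmax(scores):
    # Numerically stable softmax over a list of scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def dot_attention(query, keys, values):
    # Score each encoder state against the decoder query,
    # normalize with softmax, and return the weighted sum
    # of the values: the context vector.
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    weights = softmax(scores)
    context = [sum(w * v[d] for w, v in zip(weights, values))
               for d in range(len(values[0]))]
    return context, weights

# Toy encoder states; keys == values here.
enc = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
query = [1.0, 0.0]
context, weights = dot_attention(query, enc, enc)
print(abs(sum(weights) - 1.0) < 1e-9)  # True: softmax weights sum to 1
```

The Transformer (see The Annotated Transformer below) generalizes this to scaled multi-head attention, but the weighted-sum core is the same.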

  * [[https://www.youtube.com/watch?v=yIdF-17HwSk&list=PLoROMvodv4rOhcuXMZkNm7j3fVwBBY42z&index=11|Question Answering (cs224n lecture 10)]]

  * [[https://www.youtube.com/watch?v=S-CspeZ8FHc&list=PLoROMvodv4rOhcuXMZkNm7j3fVwBBY42z&index=14|Contextual Word Embeddings (cs224n lecture 13)]]

  * Supplementary material:
    * [[http://ruder.io/deep-learning-nlp-best-practices/index.html|Deep Learning for NLP Best Practices]] (Sebastian Ruder)
    * [[http://nlp.seas.harvard.edu/2018/04/03/attention.html|The Annotated Transformer]] (Alexander Rush)
    * [[https://talktotransformer.com/]]
    * [[http://ruder.io/4-biggest-open-problems-in-nlp/|The 4 Biggest Open Problems in NLP]] (Sebastian Ruder)