Slides for the Big Data Symposium
Self-attention for Text Analytics
Visualization
Reader level: Intermediate. This post covers the self-attention mechanism as presented in the paper ‘A Structured Self-attentive Sentence Embedding’, which is, IMHO, one of the best papers for illustrating how self-attention works in Natural Language Processing. The structure of self-attention is shown in the image below, courtesy of the paper:
Suppose one has an LSTM with hidden dimension ‘u’ that takes as input batches of sentences, each ‘n’ words long.
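As a rough sketch of that setup (my own PyTorch code, not the authors’ implementation; the dimension names u, d_a, r, and n are borrowed from the paper, while the module and variable names are illustrative choices):

```python
# A minimal sketch of the structured self-attention layer from the paper,
# assuming PyTorch. A = softmax(W_s2 tanh(W_s1 H^T)), M = A H.
import torch
import torch.nn as nn

class StructuredSelfAttention(nn.Module):
    def __init__(self, u: int = 300, d_a: int = 350, r: int = 30):
        super().__init__()
        # H has 2u columns because the LSTM is bidirectional.
        self.W_s1 = nn.Linear(2 * u, d_a, bias=False)  # d_a x 2u in the paper
        self.W_s2 = nn.Linear(d_a, r, bias=False)      # r x d_a in the paper

    def forward(self, H: torch.Tensor) -> torch.Tensor:
        # H: (batch, n, 2u) -- hidden states for a batch of n-word sentences.
        # Softmax over the n words gives r attention distributions per sentence.
        A = torch.softmax(self.W_s2(torch.tanh(self.W_s1(H))), dim=1)  # (batch, n, r)
        # M = A^T H: r weighted sums of the hidden states -> sentence embedding.
        M = A.transpose(1, 2) @ H                                      # (batch, r, 2u)
        return M

# Usage with toy sizes: a batch of 4 sentences, n = 20 words, u = 300.
if __name__ == "__main__":
    lstm = nn.LSTM(input_size=100, hidden_size=300,
                   bidirectional=True, batch_first=True)
    tokens = torch.randn(4, 20, 100)   # pretend word embeddings
    H, _ = lstm(tokens)                # (4, 20, 600)
    M = StructuredSelfAttention()(H)
    print(M.shape)                     # torch.Size([4, 30, 600])
```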
Introduction to Machine Learning
NLI Class slides
This page contains slides for the ‘Introduction to Machine Learning’ NLI class series that I have taught at Virginia Tech.