Self-attention for Text Analytics

Visualization

Reader level: Intermediate

This post covers the self-attention mechanism as presented in the paper 'A Structured Self-attentive Sentence Embedding', which is, IMHO, one of the best papers for illustrating the workings of self-attention for Natural Language Processing. The structure of self-attention is shown in the image below, courtesy of the paper. Suppose one has an LSTM of dimension 'u' that takes as input batches of sentences of 'n' words each. [Read More]
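A minimal PyTorch sketch of the structured self-attention head described in the paper, assuming a bidirectional LSTM with hidden size 'u' whose hidden states are already computed; the hyperparameter names d_a and r (attention hops) follow the paper's notation, while the class name and example sizes are illustrative:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class StructuredSelfAttention(nn.Module):
    """Sketch of the attention head from 'A Structured Self-attentive Sentence Embedding'."""
    def __init__(self, lstm_dim_u, d_a=64, r=4):
        super().__init__()
        # W_s1 is (d_a x 2u) and W_s2 is (r x d_a) in the paper's notation
        self.W_s1 = nn.Linear(2 * lstm_dim_u, d_a, bias=False)
        self.W_s2 = nn.Linear(d_a, r, bias=False)

    def forward(self, H):
        # H: (batch, n_words, 2u) hidden states of a bidirectional LSTM
        A = F.softmax(self.W_s2(torch.tanh(self.W_s1(H))), dim=1)  # attention over the n words
        A = A.transpose(1, 2)                                      # (batch, r, n_words)
        M = torch.bmm(A, H)                                        # (batch, r, 2u) sentence embedding
        return M, A

# Illustrative usage with random hidden states
u, n, batch = 128, 20, 8
H = torch.randn(batch, n, 2 * u)
M, A = StructuredSelfAttention(lstm_dim_u=u)(H)
print(M.shape, A.shape)  # torch.Size([8, 4, 256]) torch.Size([8, 4, 20])
```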

Word2Vec in Pytorch - Continuous Bag of Words and Skipgrams

Pytorch implementation

Reader level: Intermediate

Overview of Word Embeddings

Word embeddings, in short, are numerical representations of text. They are represented as 'n-dimensional' vectors, where the number of dimensions 'n' is determined by the corpus size and the expressiveness desired. The larger your corpus, the larger you want 'n' to be; a larger 'n' also lets the embedding capture more features. However, a larger dimension makes the optimization longer and more difficult, so you want an 'n' that is just large enough, and determining that size is often problem-specific. [Read More]
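A minimal PyTorch sketch of the Continuous Bag of Words idea from the post's title, shown here only to make the 'n-dimensional vector' point concrete; the class name, vocabulary size, and embedding size are illustrative and not taken from the post's code:

```python
import torch
import torch.nn as nn

# CBOW sketch: predict the centre word from the mean of its context-word embeddings.
class CBOW(nn.Module):
    def __init__(self, vocab_size, n_dim=100):      # n_dim is the embedding size 'n'
        super().__init__()
        self.embeddings = nn.Embedding(vocab_size, n_dim)
        self.out = nn.Linear(n_dim, vocab_size)

    def forward(self, context_ids):
        # context_ids: (batch, context_window) indices of the words around the target
        v = self.embeddings(context_ids).mean(dim=1)  # (batch, n_dim) averaged context vector
        return self.out(v)                            # logits over the vocabulary

# Illustrative usage with random indices
model = CBOW(vocab_size=5000, n_dim=100)
context = torch.randint(0, 5000, (8, 4))              # 8 examples, 4 context words each
targets = torch.randint(0, 5000, (8,))
loss = nn.CrossEntropyLoss()(model(context), targets)
loss.backward()
```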

CS4984/5984 Big Data Summarization

Class notes

Connecting to ARC machines

Cascades

The ARC cluster used for this class is 'Cascades'. Detailed instructions on how to access this machine can be found here. A quick overview of how to log in and submit jobs is given below. To log in:

ssh username@cascades1.arc.vt.edu

where username is your PID and the password is your VT PID password followed by a comma and the two-factor six-digit code, e.g. the password looks like this: [Read More]
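The login step from the excerpt, repeated as a small shell snippet; the password shown in the comment is purely illustrative of the format described above (PID password, comma, six-digit two-factor code):

```bash
# Log in to the Cascades login node; 'username' is your VT PID.
# At the password prompt, type your PID password, a comma, then the current
# two-factor six-digit code, e.g. "MyPidPassword,123456" (illustrative only).
ssh username@cascades1.arc.vt.edu
```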