
Commit ea0620a

Merge pull request #46 from AndrewRook/figure_ref_fixes

2 parents: b839318 + 6ca684c
File tree

1 file changed: +2 -2 lines


embeddings.tex

Lines changed: 2 additions & 2 deletions
@@ -1633,7 +1633,7 @@ \subsection{Word2Vec}
     )
     return vocab
 \end{minted}
-\caption{Processing our input vocabulary and building a Vocabulary object from our dataset in PyTorch \href{https://github.com/veekaybee/viberary/tree/main/word2vec}{source}}
+\caption{Processing our input vocabulary and building a Vocabulary object from our dataset in PyTorch}
 \end{figure}

Now that we have an input vocabulary object we can work with, the next step is to create one-hot encodings of each word to a numerical position, and each position back to a word, so that we can easily reference both our words and vectors. The goal is to be able to map back and forth when we do lookups and retrieval.
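As an aside on the mapping this context paragraph describes, the word-to-position and position-to-word lookups can be sketched in a few lines of PyTorch. This is a minimal, hypothetical sketch, not the code from the viberary repository; the names vocab, word_to_ix, ix_to_word, and one_hot_encode are invented for the example.

import torch

# Illustrative vocabulary standing in for the Vocabulary object built above.
vocab = ["the", "quick", "brown", "fox"]

# Map each word to a numerical position and each position back to a word,
# so lookups work in both directions.
word_to_ix = {word: ix for ix, word in enumerate(vocab)}
ix_to_word = {ix: word for word, ix in word_to_ix.items()}

def one_hot_encode(word):
    # One-hot vector: all zeros except a 1 at the word's position.
    vec = torch.zeros(len(vocab))
    vec[word_to_ix[word]] = 1.0
    return vec

print(one_hot_encode("fox"))          # tensor([0., 0., 0., 1.])
print(ix_to_word[word_to_ix["fox"]])  # "fox"
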
@@ -1679,7 +1679,7 @@ \subsection{Word2Vec}
     self.linear2 = nn.Linear(128, self.vocab_size)
     self.activation_function2 = nn.LogSoftmax(dim=-1)
 \end{minted}
-\caption{Word2Vec CBOW implementation in Pytorch. \href{https://github.com/veekaybee/viberary/tree/main/word2vec}{source}}
+\caption{Word2Vec CBOW implementation in Pytorch. \href{https://github.com/veekaybee/viberary/blob/722c4b576bf1bdaf59c806661b23bd3afaabf883/src/notebooks/cbow.ipynb#L12}{source}}
 \end{figure}
