
Chapter 4. Advanced Word2vec

In Chapter 3, Word2vec – Learning Word Embeddings, we introduced you to Word2vec, the basics of learning word embeddings, and the two common Word2vec algorithms: skip-gram and CBOW. In this chapter, we will discuss several topics related to Word2vec, focusing on these two algorithms and extensions.

First, we will explore how the original skip-gram algorithm was implemented and how it compares to its more modern variant, which we used in Chapter 3, Word2vec – Learning Word Embeddings. We will examine the differences between skip-gram and CBOW and look at how the loss of the two approaches behaves over time. We will also discuss which method works better, drawing on both our own observations and the available literature.

We will discuss several extensions to the existing Word2vec methods that boost performance. These extensions include using more effective techniques for sampling negative examples in negative sampling and ignoring uninformative words during learning, among others. You will also learn about a novel word embedding learning technique known as Global Vectors (GloVe) and the specific advantages that GloVe has over skip-gram and CBOW.

Finally, you will learn how to use Word2vec to solve a real-world problem: document classification. We will do this with a simple trick: obtaining document embeddings from word embeddings.
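As a preview, the trick boils down to combining the word vectors of a document into a single fixed-size vector, for instance by averaging them. The sketch below illustrates this idea under that averaging assumption; the vocabulary and randomly initialized embedding matrix are toy placeholders standing in for vectors learned with skip-gram or CBOW, and the details are developed later in the chapter.

```python
import numpy as np

# Toy vocabulary and embedding matrix; in practice these would come from
# a trained skip-gram or CBOW model.
vocabulary = {"cat": 0, "sat": 1, "mat": 2, "dog": 3}
embedding_dim = 5
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocabulary), embedding_dim))

def document_embedding(doc_tokens, embeddings, vocabulary):
    """Average the word vectors of all in-vocabulary tokens in a document."""
    ids = [vocabulary[t] for t in doc_tokens if t in vocabulary]
    if not ids:
        # No known words: fall back to a zero vector of the same dimensionality.
        return np.zeros(embeddings.shape[1])
    return embeddings[ids].mean(axis=0)

doc = ["the", "cat", "sat", "on", "the", "mat"]
doc_vec = document_embedding(doc, embeddings, vocabulary)
print(doc_vec.shape)  # (5,) – one fixed-size vector per document
```

The resulting document vectors can then be fed to any standard classifier, since every document is mapped to a vector of the same dimensionality regardless of its length.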
