
Chapter 4. Advanced Word2vec

In Chapter 3, Word2vec – Learning Word Embeddings, we introduced you to Word2vec, the basics of learning word embeddings, and the two common Word2vec algorithms: skip-gram and CBOW. In this chapter, we will discuss several topics related to Word2vec, focusing on these two algorithms and their extensions.

First, we will explore how the original skip-gram algorithm was implemented and how it compares to its more modern variant, which we used in Chapter 3, Word2vec – Learning Word Embeddings. We will examine the differences between skip-gram and CBOW and look at how the loss of the two approaches behaves over time. We will also discuss which method works better, drawing on both our own observations and the available literature.

We will then discuss several extensions to the existing Word2vec methods that boost performance. These extensions include using more effective techniques for drawing negative examples in negative sampling and ignoring uninformative words during learning, among others. You will also learn about a novel word embedding learning technique known as Global Vectors (GloVe) and the specific advantages GloVe has over skip-gram and CBOW.
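To give a flavor of these extensions before the chapter develops them in detail, the following minimal NumPy sketch shows two techniques from the original Word2vec work: distorting the unigram distribution with a 3/4 power when drawing negative samples, and subsampling very frequent (often uninformative) words with probability 1 - sqrt(t / f(w)). The tiny vocabulary, counts, and variable names here are made up for illustration only.

```python
import numpy as np

# Hypothetical unigram counts for a toy vocabulary (illustrative values only)
word_counts = {"the": 5000, "cat": 40, "sat": 25, "on": 3000, "mat": 10}
words = list(word_counts)
freqs = np.array([word_counts[w] for w in words], dtype=np.float64)
freqs /= freqs.sum()  # relative frequencies f(w)

# Negative-sampling distribution: raise unigram frequencies to the 3/4 power
# and renormalize, so rare words are sampled a bit more often than their
# raw frequency would suggest.
neg_probs = freqs ** 0.75
neg_probs /= neg_probs.sum()

# Subsampling of frequent words: each occurrence of word w is discarded
# with probability 1 - sqrt(t / f(w)), for a small threshold t.
t = 1e-5
discard_probs = np.clip(1.0 - np.sqrt(t / freqs), 0.0, 1.0)

for w, p_neg, p_drop in zip(words, neg_probs, discard_probs):
    print(f"{w:>4s}  P(negative sample)={p_neg:.3f}  P(discard)={p_drop:.3f}")
```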

Finally, you will learn how to use Word2vec to solve a real-world problem: document classification. We will do this with a simple trick for obtaining document embeddings from word embeddings, as sketched below.
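One common way to turn word embeddings into a single document embedding is to average the vectors of the words appearing in the document. The sketch below illustrates that idea with made-up 4-dimensional vectors and a hypothetical `document_embedding` helper; it is only a preview of the general approach, not necessarily the exact procedure the chapter develops.

```python
import numpy as np

# Toy word-embedding lookup; in practice these vectors would come from a
# trained skip-gram or CBOW model. The values here are made up.
embeddings = {
    "stock":  np.array([0.9, 0.1, 0.0, 0.2]),
    "market": np.array([0.8, 0.2, 0.1, 0.1]),
    "falls":  np.array([0.7, 0.0, 0.3, 0.4]),
}

def document_embedding(tokens, embeddings):
    """Average the embeddings of the in-vocabulary words of a document."""
    vectors = [embeddings[t] for t in tokens if t in embeddings]
    return np.mean(vectors, axis=0) if vectors else None

doc = ["stock", "market", "falls"]
print(document_embedding(doc, embeddings))
```

The resulting fixed-length vector can then be fed to a standard classifier (or clustered) to categorize documents.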
