embedding
Understanding node2vec
Written by razrlele on March 16, 2020
I have recently been reading about graph embedding, starting with node2vec, and here I jot down some of my understanding and practice. Theory: when you see "embedding", the first thing that comes to mind is word2vec, Tomas Mikolov's seminal embedding work from 2013. It was initially used mainly in NLP: based on the co-occurrence of words within sentence sequences in a corpus, it learns a vector representation for each word. Later people realized that this is not limited to NLP; in other domains, as long as we can construct reasonable sequences of items, we can likewise learn vector representations of items from their co-occurrence. Most of the work in graph embedding is really about how to construct such reasonable sequences. A minimal sketch of this idea appears after this excerpt.
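The sketch below illustrates the sequence-construction idea with uniform random walks (as in DeepWalk) fed into gensim's Word2Vec; node2vec itself replaces the uniform walks with biased walks controlled by return and in-out parameters p and q. The graph, walk length, number of walks, and vector size are illustrative choices, not values from the post.

```python
# Sketch: build item sequences from a graph via uniform random walks,
# then learn node embeddings from co-occurrence with skip-gram.
import random
import networkx as nx
from gensim.models import Word2Vec

def random_walk(graph, start, walk_length=10):
    """Generate one uniform random walk starting from `start`."""
    walk = [start]
    for _ in range(walk_length - 1):
        neighbors = list(graph.neighbors(walk[-1]))
        if not neighbors:
            break
        walk.append(random.choice(neighbors))
    return walk

# Toy graph; in practice this would be e.g. a user-item or co-click graph.
G = nx.karate_club_graph()

# Treat each walk as a "sentence" whose tokens are node ids.
walks = [
    [str(node) for node in random_walk(G, start)]
    for start in G.nodes()
    for _ in range(20)  # number of walks per start node
]

# Skip-gram over the walks learns a vector per node, just as word2vec
# learns a vector per word from sentences.
model = Word2Vec(walks, vector_size=64, window=5, min_count=0, sg=1, epochs=5)
print(model.wv.most_similar("0", topn=5))
```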
Word2Vec
Written by razrlele on October 28, 2018
The Word2Vec algorithm is a milestone in NLP. Through training, it maps text content into K-dimensional vectors, which makes the results easy to reuse in other settings such as recommender systems or ad click-through-rate prediction. It was published in 2013 by Tomas Mikolov, then at Google; the two main papers are "Distributed Representations of Words and Phrases and their Compositionality" and "Efficient Estimation of Word Representations in Vector Space".
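As a hedged usage sketch of the "text to K-dimensional vector" idea, the snippet below trains gensim's Word2Vec on a toy corpus and reads back a word vector; the corpus and hyperparameters are illustrative only.

```python
# Sketch: train skip-gram Word2Vec on a tiny corpus and inspect a vector.
from gensim.models import Word2Vec

corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "log"],
]

# vector_size is the K mentioned above: each word maps to a K-dim vector.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

vec = model.wv["cat"]                          # K-dimensional embedding for "cat"
print(vec.shape)                               # (50,)
print(model.wv.similarity("cat", "dog"))       # cosine similarity of two words
```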