Author's reply: I recommend following my recommender-system paper list; the papers there are as classic as it gets. https://github.com/wzhe06/Reco-papers
Author's reply: The most direct approach is to concatenate the embeddings and hand the result to the subsequent multi-layer neural network.
To capture interactions between selected key embeddings, you can also apply element-wise crosses between multiple embeddings, or feed the results of various product operations into the downstream MLP.
There are many possible operations on embeddings and no single correct answer, which is exactly why new model variants keep emerging.
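To make the concatenation and element-wise cross concrete, here is a minimal NumPy sketch. The user/item naming, the embedding dimension, and the random MLP weights are all hypothetical placeholders, not the author's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: two feature embeddings (say, user and item), dim 4 each.
user_emb = rng.normal(size=4)
item_emb = rng.normal(size=4)

# 1) Simplest combination: concatenate, then feed to an MLP.
concat = np.concatenate([user_emb, item_emb])      # shape (8,)

# 2) Explicit element-wise cross between the two embeddings.
elementwise = user_emb * item_emb                  # shape (4,)

# Combined MLP input: raw concatenation plus the explicit cross features.
mlp_input = np.concatenate([concat, elementwise])  # shape (12,)

# A minimal one-hidden-layer MLP; weights here are random placeholders.
W1 = rng.normal(size=(12, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1));  b2 = np.zeros(1)
hidden = np.maximum(0.0, mlp_input @ W1 + b1)      # ReLU activation
score = hidden @ W2 + b2                           # final logit, shape (1,)
```

In practice the element-wise or product terms are just extra input features; the MLP is free to learn further interactions on top of them.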
Author's reply: Yes, it will cover the principles and details of GraphSAGE.
Author's reply: This is a very common question, and I recommend that other readers with the same doubt take note as well.
The explanation in the original paper is worth quoting:
We observe that BFS and DFS strategies play a key role in producing representations that reflect either of the above equivalences.
In particular, the neighborhoods sampled by BFS lead to embeddings that correspond closely to structural equivalence.
The opposite is true for DFS which can explore larger parts of the network as it can move further away from the source node u (with sample size k being fixed).
In DFS, the sampled nodes more accurately reflect a macro-view of the neighborhood which is essential in inferring communities based on homophily.
Original paper: https://github.com/wzhe06/Reco-papers/blob/master/Embedding/%5BNode2vec%5D%20Node2vec%20-%20Scalable%20Feature%20Learning%20for%20Networks%20%28Stanford%202016%29.pdf
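The BFS/DFS trade-off quoted above is controlled in node2vec by the return parameter p and the in-out parameter q in a second-order biased random walk. The sketch below illustrates that bias on a toy adjacency-list graph; the graph and parameter values are illustrative, and a real implementation would precompute alias tables for efficiency.

```python
import random

# A toy undirected graph as adjacency lists (hypothetical example).
graph = {
    0: [1, 2],
    1: [0, 2, 3],
    2: [0, 1],
    3: [1, 4],
    4: [3],
}

def transition_weight(prev, nxt, p, q):
    """node2vec's second-order bias alpha_pq for stepping to nxt, given the
    walk arrived at the current node from prev."""
    if nxt == prev:             # return to the previous node
        return 1.0 / p
    if nxt in graph[prev]:      # stays at distance 1 from prev: BFS-like step
        return 1.0
    return 1.0 / q              # moves further from prev: DFS-like step

def node2vec_walk(start, length, p, q, rng=random):
    walk = [start]
    while len(walk) < length:
        curr = walk[-1]
        nbrs = graph[curr]
        if len(walk) == 1:
            # First step has no previous node: sample uniformly.
            walk.append(rng.choice(nbrs))
            continue
        prev = walk[-2]
        weights = [transition_weight(prev, x, p, q) for x in nbrs]
        walk.append(rng.choices(nbrs, weights=weights, k=1)[0])
    return walk
```

Setting q > 1 keeps the walk near the source node (BFS-like, emphasizing structural equivalence), while q < 1 pushes it outward (DFS-like, emphasizing homophily), matching the discussion in the quoted passage.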