
A little question about loading the pre-embedding #1

Open
gwaterCy opened this issue Aug 9, 2018 · 2 comments

Comments


gwaterCy commented Aug 9, 2018

Hi, in the code you load a local pre-embedding file. Was this file produced with some pre-training method, or is it just a random initialization?

uestcnlp (Owner) commented Aug 15, 2018

Hi, it is randomly initialized and then updated during model training. To keep the experiments consistent, we saved the embedding from the most recent random initialization and load that file.
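
In other words, the "pre-embedding" file is just one saved random initialization that every run reloads, so results are comparable across experiments. A minimal NumPy sketch of that idea, with placeholder sizes and file name rather than the repository's actual values:

```python
import os
import numpy as np

VOCAB_SIZE = 10000              # placeholder vocabulary size
EMBED_DIM = 100                 # placeholder embedding dimension
EMB_PATH = "pre_embedding.npy"  # placeholder file name

# Reuse the saved random initialization if it exists, so every run starts
# from the same embedding matrix; otherwise create one and save it.
if os.path.exists(EMB_PATH):
    embedding = np.load(EMB_PATH)
else:
    embedding = np.random.uniform(
        -0.1, 0.1, size=(VOCAB_SIZE, EMBED_DIM)
    ).astype(np.float32)
    np.save(EMB_PATH, embedding)

# The loaded matrix is then used as the initial value of the trainable
# embedding layer and updated together with the other model parameters.
```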

@DebonairLi


Hi, I'd like to ask: after randomly initializing the embedding, you didn't use an RNN to learn the embedding, right? Would adding an RNN improve the results?
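
For context, "learning the embedding with an RNN" would mean running the embedded tokens through a recurrent layer so each position picks up sequence context. A hypothetical PyTorch sketch of that wiring (not the code in this repository; whether it actually helps would have to be measured):

```python
import torch
import torch.nn as nn

class EmbedWithRNN(nn.Module):
    """Hypothetical: refine randomly initialized embeddings with a GRU."""

    def __init__(self, vocab_size=10000, embed_dim=100, hidden_dim=100):
        super().__init__()
        # Embedding starts from random initialization and is trained.
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # GRU mixes in left-to-right sequence context.
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer indices
        emb = self.embedding(token_ids)   # (batch, seq_len, embed_dim)
        out, _ = self.rnn(emb)            # (batch, seq_len, hidden_dim)
        return out
```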
