
(先不要关注)课程内容中的语法错误等 ("Ignore for now: grammatical errors etc. in the course content") #32

Open

yeyeyeyeeeee opened this issue Sep 15, 2021 · 3 comments

Comments
@yeyeyeyeeeee

No description provided.

@yeyeyeyeeeee (Author)

In the input-processing part of "Transformer structure details" in section 2.2, the phrase "那么就填充先填充..." is garbled (the word 填充 is duplicated).

@yeyeyeyeeeee changed the title from 课程内容中的语法错误等 to (先不要关注)课程内容中的语法错误等 on Sep 15, 2021
@yeyeyeyeeeee (Author)

In the formulas of the "Self-Attention details" section, change "W^Q, W^K, W^K" to "W^Q, W^K, W^V".

@IceCapriccio

Errata for section 2.2:

  1. "先将Transformer这种特殊的seqseq模型" -> "先将Transformer这种特殊的seq2seq模型" (fixes "seqseq" to "seq2seq")
  2. "而Self Attention机制值得是" -> "而Self Attention的机制是" (fixes the misused 值得)
  3. ""it"在模型中的表示,融合了"animal"和"tire"的部分表达。" -> ""it"在模型中的表示,融合了"animal"和"tired"的部分表达。" (fixes "tire" to "tired")
  4. In the four lines of formulas in the "Self-Attention details" subsection, on the step 2-3 line, the numerator of score_xy should be q_1 * k_1, and so on.
