🎯 Focusing
Engineer at Inspur
University of Chinese Academy of Sciences
Beijing
Popular repositories
multi-head-self-attention (Public)
Tests a multi-head attention mechanism on the STS dataset, using PyTorch and torchtext. The code is concise, making it well suited for newcomers who want to see how multi-head attention works without the many layers a full transformer involves: just multi-head attention + one linear layer. A minimal sketch of this setup follows below.
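The following is a hedged sketch of the architecture the description names (one multi-head self-attention layer plus one linear layer, pooled into a score), not the repository's actual code; the class name, vocabulary size, and dimensions are assumptions.

```python
import torch
import torch.nn as nn

# Hypothetical sketch: one nn.MultiheadAttention layer plus one linear
# layer, mean-pooled into a scalar score for STS-style sentence scoring.
# All names and sizes below are placeholder assumptions.
class MHASimilarity(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=128, num_heads=8):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.linear = nn.Linear(embed_dim, 1)  # the single linear layer

    def forward(self, tokens):                 # tokens: (batch, seq_len) ids
        x = self.embed(tokens)
        out, _ = self.attn(x, x, x)            # self-attention: q = k = v
        pooled = out.mean(dim=1)               # mean-pool over positions
        return self.linear(pooled).squeeze(-1) # (batch,) scores

model = MHASimilarity()
scores = model(torch.randint(0, 10000, (4, 16)))  # toy batch of 4 sequences
print(scores.shape)  # torch.Size([4])
```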
transformer-with-annotation (Public)
A transformer translation model with annotations. A hedged sketch of such a model appears below.
Python · 5 stars
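As an illustration of what a transformer translation model looks like in PyTorch, here is a minimal sketch built on the library's nn.Transformer; it is an assumption-laden stand-in, not the repository's code, and vocabulary sizes, dimensions, and positional encodings (omitted for brevity) are placeholders.

```python
import torch
import torch.nn as nn

# Hypothetical sketch of a transformer translation model using
# PyTorch's built-in nn.Transformer; positional encodings omitted.
class TinyTranslator(nn.Module):
    def __init__(self, src_vocab=8000, tgt_vocab=8000, d_model=256):
        super().__init__()
        self.src_embed = nn.Embedding(src_vocab, d_model)
        self.tgt_embed = nn.Embedding(tgt_vocab, d_model)
        self.transformer = nn.Transformer(d_model=d_model, batch_first=True)
        self.out = nn.Linear(d_model, tgt_vocab)

    def forward(self, src, tgt):
        # causal mask: each target position attends only to earlier ones
        tgt_mask = nn.Transformer.generate_square_subsequent_mask(tgt.size(1))
        h = self.transformer(self.src_embed(src), self.tgt_embed(tgt),
                             tgt_mask=tgt_mask)
        return self.out(h)                    # (batch, tgt_len, tgt_vocab)

model = TinyTranslator()
logits = model(torch.randint(0, 8000, (2, 10)),  # source token ids
               torch.randint(0, 8000, (2, 9)))   # shifted target token ids
print(logits.shape)  # torch.Size([2, 9, 8000])
```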