  • BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (paper notes)

    主要結(jié)構(gòu)仍是 Transformer Input:a. WordPiece embeddingsb. learned positional embeddings, up...

  • Linguistically-Informed Self-Attention for Semantic Role Labeling (paper notes)

    Jointly predicts parts of speech and predicates. Parts of speech: POS tagging; predicates: predicate labeling, which is Sema... (see the joint-tagging sketch after this list)

  • Character-Level Language Modeling with Deeper Self-Attention (paper notes)

    1. Self-Attention: uses the Transformer architecture. 2. Deep: 64 Transformer layers. 3. Adds Auxilia... (see the auxiliary-loss sketch after this list)

  • Win10 and Ubuntu 16 dual boot, plus CUDA installation

    Win10 and Ubuntu 16 dual boot: 1. Install Win10 normally. 2. Go to This PC -- Manage -- Disk Management and shrink or delete a volume to reserve disk space for Ubuntu. 3. Use UltraISO to make a bootable U...
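
The BERT note above mentions WordPiece embeddings and learned positional embeddings as the model input. Below is a minimal PyTorch sketch of how such an input layer could be assembled, for illustration only; it is not BERT's actual code. The class name is made up, the default sizes (30522-token vocabulary, hidden size 768, 512 positions) and the added segment embeddings follow the published BERT-Base setup but are assumptions relative to this note.

    import torch
    import torch.nn as nn

    class BertStyleInputEmbeddings(nn.Module):
        # Token (WordPiece) + learned positional + segment embeddings,
        # summed and layer-normalized. Sizes are placeholders.
        def __init__(self, vocab_size=30522, hidden=768, max_len=512, n_segments=2):
            super().__init__()
            self.tok = nn.Embedding(vocab_size, hidden)   # WordPiece embeddings
            self.pos = nn.Embedding(max_len, hidden)      # learned positional embeddings
            self.seg = nn.Embedding(n_segments, hidden)   # sentence A / sentence B
            self.norm = nn.LayerNorm(hidden)

        def forward(self, token_ids, segment_ids):
            positions = torch.arange(token_ids.size(1), device=token_ids.device)
            positions = positions.unsqueeze(0).expand_as(token_ids)
            x = self.tok(token_ids) + self.pos(positions) + self.seg(segment_ids)
            return self.norm(x)

    emb = BertStyleInputEmbeddings()
    ids = torch.randint(0, 30522, (2, 8))         # batch of 2 sequences, length 8
    segs = torch.zeros(2, 8, dtype=torch.long)    # all tokens in segment A
    print(emb(ids, segs).shape)                   # torch.Size([2, 8, 768])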
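
The LISA note above describes jointly predicting parts of speech and predicates. The joint-tagging sketch below shows one generic way to share a self-attention encoder between a POS-tagging head and a predicate-detection head; it is not the paper's LISA implementation, and the class name, layer counts, and tag-set size are made-up illustration values.

    import torch
    import torch.nn as nn

    class JointPosPredicateTagger(nn.Module):
        # Shared self-attention encoder with two token-level heads:
        # one for POS tags, one for predicate / non-predicate.
        def __init__(self, vocab_size=10000, d_model=128, n_pos_tags=45):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, d_model)
            layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers=2)
            self.pos_head = nn.Linear(d_model, n_pos_tags)
            self.pred_head = nn.Linear(d_model, 2)

        def forward(self, token_ids):
            h = self.encoder(self.embed(token_ids))
            return self.pos_head(h), self.pred_head(h)

    model = JointPosPredicateTagger()
    tokens = torch.randint(0, 10000, (1, 6))
    pos_logits, pred_logits = model(tokens)
    print(pos_logits.shape, pred_logits.shape)    # (1, 6, 45) (1, 6, 2)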
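
The character-level LM note above mentions a deep stack of Transformer layers plus auxiliary losses. The auxiliary-loss sketch below illustrates one simplified form of that idea, where intermediate layers also predict the next character and their losses are added to the final-layer loss with a smaller weight; the depth, sizes, and the 0.5 weight are toy values, not the paper's 64-layer setup or its exact loss schedule.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class DeepCharLM(nn.Module):
        # Stack of Transformer layers; every layer's output feeds a shared
        # next-character head, giving one auxiliary loss per intermediate layer.
        def __init__(self, n_chars=256, d_model=128, n_layers=4):
            super().__init__()
            self.embed = nn.Embedding(n_chars, d_model)
            self.layers = nn.ModuleList(
                [nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
                 for _ in range(n_layers)]
            )
            self.out = nn.Linear(d_model, n_chars)

        def forward(self, char_ids, targets):
            T = char_ids.size(1)
            # Causal mask: each position may only attend to earlier characters.
            mask = torch.triu(torch.full((T, T), float("-inf")), diagonal=1)
            h, losses = self.embed(char_ids), []
            for layer in self.layers:
                h = layer(h, src_mask=mask)
                logits = self.out(h)              # prediction from this layer
                losses.append(F.cross_entropy(logits.flatten(0, 1), targets.flatten()))
            # Final-layer loss plus down-weighted intermediate (auxiliary) losses.
            return losses[-1] + 0.5 * sum(losses[:-1])

    model = DeepCharLM()
    x = torch.randint(0, 256, (2, 16))            # input characters
    y = torch.randint(0, 256, (2, 16))            # next-character targets
    print(model(x, y).item())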
