[MoH Paper Review] - MULTI-HEAD ATTENTION AS MIXTURE-OF-HEAD ATTENTION

*This is a review post for the MoH paper! If you have any questions, please leave them in the comments!

MoH paper: [2410.11842] MoH: Multi-Head Attention as Mixture-of-Head Attention (arxiv.org)

From the abstract: "In this work, we upgrade the multi-head attention mechanism, the core of the Transformer model, to improve efficiency while maintaining or surpassing the previous accuracy level. We show that multi-head attention..."
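Before diving into the review, here is a minimal sketch of the core idea the title suggests: treating the attention heads as experts in a mixture-of-experts, with a router that activates only the top-k heads for each token and weights their outputs by the gate scores. Note that this is my own illustrative toy code, not the authors' implementation; the class name, the single linear router, and the top_k parameter are all assumptions, and the actual MoH formulation in the paper has additional details.

```python
import torch
import torch.nn as nn

class MixtureOfHeadAttention(nn.Module):
    """Toy sketch (not the paper's code): multi-head attention where a
    router activates only the top-k heads per token, MoE-style."""
    def __init__(self, dim: int, num_heads: int, top_k: int):
        super().__init__()
        assert dim % num_heads == 0
        self.num_heads = num_heads
        self.top_k = top_k
        self.head_dim = dim // num_heads
        self.qkv = nn.Linear(dim, 3 * dim)
        self.router = nn.Linear(dim, num_heads)  # one score per head, per token
        self.proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, D = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)

        # Split into heads: (B, T, D) -> (B, num_heads, T, head_dim)
        def split(t):
            return t.view(B, T, self.num_heads, self.head_dim).transpose(1, 2)
        q, k, v = split(q), split(k), split(v)

        # Standard scaled dot-product attention, computed per head.
        attn = (q @ k.transpose(-2, -1)) / self.head_dim ** 0.5
        heads = attn.softmax(dim=-1) @ v          # (B, H, T, head_dim)
        heads = heads.transpose(1, 2)             # (B, T, H, head_dim)

        # Routing: keep only the top-k heads per token; the rest get
        # weight 0, so their outputs are dropped from the summation.
        scores = self.router(x)                   # (B, T, H)
        top_vals, top_idx = scores.topk(self.top_k, dim=-1)
        gate = torch.zeros_like(scores).scatter(
            -1, top_idx, top_vals.softmax(dim=-1)
        )
        out = (heads * gate.unsqueeze(-1)).reshape(B, T, D)
        return self.proj(out)

# Quick shape check: 8 heads, but only 4 active per token.
moh = MixtureOfHeadAttention(dim=64, num_heads=8, top_k=4)
print(moh(torch.randn(2, 10, 64)).shape)  # torch.Size([2, 10, 64])
```

The point of the gating step is that standard multi-head attention can be viewed as a summation over head outputs with uniform (implicit) weights; MoH replaces that with a sparse, learned weighting, which is where the efficiency gain comes from.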