
lssl (3)
[Mamba Paper Review 3] - S4: Efficiently Modeling Long Sequences with Structured State Spaces *This is part 3 of the Mamba paper review series! If you have any questions, please leave a comment! Series 1: HiPPO / Series 2: LSSL / Series 3: S4 / Series 4: Mamba / Series 5: Vision Mamba. S4 paper: [2111.00396] Efficiently Modeling Long Sequences with Structured State Spaces (arxiv.org). Efficiently Modeling Long Sequences with Structured State Spaces: A central goal of sequence modeling is designing a single principled model that can address sequence data across a range of modal..
[Mamba Paper Review 2] - LSSL: Combining Recurrent, Convolutional, and Continuous-time Models with Linear State-Space Layers *This is part 2 of the Mamba paper review series! If you have any questions, please leave a comment! Series 1: HiPPO / Series 2: LSSL / Series 3: S4 / Series 4: Mamba / Series 5: Vision Mamba. LSSL paper: [2110.13985] Combining Recurrent, Convolutional, and Continuous-time Models with Linear State-Space Layers (arxiv.org). Combining Recurrent, Convolutional, and Continuous-time Models with Linear State-Space Layers: Recurrent neural networks (RNNs), temporal convolutions, and neural d..
[Mamba Paper Review 1] - HiPPO: Recurrent Memory with Optimal Polynomial Projections *This is part 1 of the Mamba paper review series! If you have any questions, please leave a comment! Series 1: HiPPO / Series 2: LSSL / Series 3: S4 / Series 4: Mamba / Series 5: Vision Mamba. HiPPO paper: https://arxiv.org/abs/2008.07669. HiPPO: Recurrent Memory with Optimal Polynomial Projections: A central problem in learning from sequential data is representing cumulative history in an incremental fashion as more data is processed. We introduce a general framework (HiPPO) for the o..
