this post was submitted on 30 Nov 2023

Machine Learning

1 readers
1 users here now

Community Rules:

founded 1 year ago
MODERATORS
 

Paper: https://arxiv.org/abs/2311.04823

Code: https://github.com/OpenNLPLab/HGRN

Models: https://huggingface.co/OpenNLPLab

Abstract:

Transformers have surpassed RNNs in popularity due to their superior abilities in parallel training and long-term dependency modeling. Recently, there has been a renewed interest in using linear RNNs for efficient sequence modeling. These linear RNNs often employ gating mechanisms on the output of the linear recurrence layer while ignoring the significance of using forget gates within the recurrence. In this paper, we propose a gated linear RNN model dubbed Hierarchically Gated Recurrent Neural Network (HGRN), which includes forget gates that are lower bounded by a learnable value. The lower bound increases monotonically when moving up layers. This allows the upper layers to model long-term dependencies and the lower layers to model more local, short-term dependencies. Experiments on language modeling, image classification, and Long Range Arena benchmarks showcase the efficiency and effectiveness of our proposed model. The source code is available at https://github.com/OpenNLPLab/HGRN.
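
The core mechanism is simple enough to sketch. Below is a minimal, hypothetical PyTorch sketch of the idea described in the abstract (not the authors' implementation from the linked repo): each layer is an element-wise linear recurrence whose forget gate is rescaled to lie in [gamma, 1), and the per-layer lower bounds gamma are made monotonically increasing with depth via a cumulative softmax. The names `BoundedForgetGateRNN` and `hgrn_stack`, the residual connections, and the exact bound parameterization are assumptions for illustration only; see the official code for the real thing.

```python
# Minimal sketch of the HGRN idea (an assumption-based illustration, not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F


class BoundedForgetGateRNN(nn.Module):
    """One element-wise linear-recurrence layer whose forget gate is lower-bounded by gamma."""

    def __init__(self, dim: int):
        super().__init__()
        self.to_input = nn.Linear(dim, dim)
        self.to_forget = nn.Linear(dim, dim)
        self.to_out = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor, gamma: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim); gamma: scalar lower bound in [0, 1) for this layer
        i = self.to_input(x)                    # candidate input
        f = torch.sigmoid(self.to_forget(x))    # unconstrained forget gate in (0, 1)
        f = gamma + (1.0 - gamma) * f           # rescale so the gate lives in [gamma, 1)
        h = x.new_zeros(x.shape[0], x.shape[2])
        outs = []
        for t in range(x.shape[1]):             # sequential loop here; the paper uses a parallel scan
            h = f[:, t] * h + (1.0 - f[:, t]) * i[:, t]
            outs.append(h)
        return self.to_out(torch.stack(outs, dim=1))


def hgrn_stack(x: torch.Tensor, layers: nn.ModuleList, bound_logits: torch.Tensor) -> torch.Tensor:
    # bound_logits has num_layers + 1 entries (a hypothetical parameterization):
    # the cumulative sum of its softmax, with the last entry dropped, yields lower
    # bounds that increase monotonically with depth while staying strictly below 1,
    # so upper layers are pushed toward longer memory.
    gammas = torch.cumsum(F.softmax(bound_logits, dim=0), dim=0)[:-1]
    for layer, gamma in zip(layers, gammas):
        x = x + layer(x, gamma)                 # residual connection around each recurrence layer
    return x


# Toy usage: 4 layers, hidden size 64, batch of 2 sequences of length 16.
dim, num_layers = 64, 4
layers = nn.ModuleList([BoundedForgetGateRNN(dim) for _ in range(num_layers)])
bound_logits = nn.Parameter(torch.zeros(num_layers + 1))
y = hgrn_stack(torch.randn(2, 16, dim), layers, bound_logits)   # -> (2, 16, 64)
```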

Preview image: https://preview.redd.it/thph9bpmjh3c1.png?width=965&format=png&auto=webp&s=8e4871cd280ef7e5b771b463435d47da11dca52d

[–] CatalyzeX_code_bot@alien.top 1 points 11 months ago

Found 1 relevant code implementation for "Hierarchically Gated Recurrent Neural Network for Sequence Modeling".

Ask the author(s) a question about the paper or code.

If you have code to share with the community, please add it here 😊🙏

--

To opt out from receiving code links, DM me.