Collections including paper arxiv:2311.16867

- Nemotron-4 15B Technical Report
  Paper • 2402.16819 • Published • 43
- Griffin: Mixing Gated Linear Recurrences with Local Attention for Efficient Language Models
  Paper • 2402.19427 • Published • 53
- RWKV: Reinventing RNNs for the Transformer Era
  Paper • 2305.13048 • Published • 15
- Reformer: The Efficient Transformer
  Paper • 2001.04451 • Published

- Attention Is All You Need
  Paper • 1706.03762 • Published • 50
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
  Paper • 1810.04805 • Published • 16
- RoBERTa: A Robustly Optimized BERT Pretraining Approach
  Paper • 1907.11692 • Published • 7
- DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
  Paper • 1910.01108 • Published • 14

- Mamba: Linear-Time Sequence Modeling with Selective State Spaces
  Paper • 2312.00752 • Published • 139
- Schrodinger Bridges Beat Diffusion Models on Text-to-Speech Synthesis
  Paper • 2312.03491 • Published • 33
- Order Matters in the Presence of Dataset Imbalance for Multilingual Learning
  Paper • 2312.06134 • Published • 2
- LLM in a flash: Efficient Large Language Model Inference with Limited Memory
  Paper • 2312.11514 • Published • 257

- Cognitive Architectures for Language Agents
  Paper • 2309.02427 • Published • 8
- Direct Preference Optimization: Your Language Model is Secretly a Reward Model
  Paper • 2305.18290 • Published • 52
- Orca 2: Teaching Small Language Models How to Reason
  Paper • 2311.11045 • Published • 71
- Pretraining Data Mixtures Enable Narrow Model Selection Capabilities in Transformer Models
  Paper • 2311.00871 • Published • 2

- Attention Is All You Need
  Paper • 1706.03762 • Published • 50
- FlashAttention-2: Faster Attention with Better Parallelism and Work Partitioning
  Paper • 2307.08691 • Published • 8
- Mixtral of Experts
  Paper • 2401.04088 • Published • 158
- Mistral 7B
  Paper • 2310.06825 • Published • 47