


- Tong Zhu, Xiaoye Qu, Daize Dong, Jiacheng Ruan, Jingqi Tong, Conghui He, Yu Cheng:
LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training. CoRR abs/2406.16554 (2024)
