"LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training."

Tong Zhu et al. (2024)

DOI: 10.48550/arXiv.2406.16554

access: open

type: Informal or Other Publication

metadata version: 2024-07-16