HMoE: Heterogeneous Mixture of Experts for Language Modeling

Bibliographic Details
Title: HMoE: Heterogeneous Mixture of Experts for Language Modeling
Authors: Wang, An; Sun, Xingwu; Xie, Ruobing; Li, Shuaipeng; Zhu, Jiaqi; Yang, Zhen; Zhao, Pinxue; Han, J. N.; Kang, Zhanhui; Wang, Di; Okazaki, Naoaki; Xu, Cheng-zhong
Publication Year: 2024
Collection: Computer Science
Subject Terms: Computer Science - Computation and Language, Computer Science - Machine Learning
Abstract: Mixture of Experts (MoE) offers remarkable performance and computational efficiency by selectively activating subsets of model parameters. Traditionally, MoE models use homogeneous experts, each with identical capacity. However, input data varies in complexity and thus calls for experts with diverse capabilities, whereas homogeneous MoE hinders effective expert specialization and efficient parameter utilization. In this study, we propose a novel Heterogeneous Mixture of Experts (HMoE), where experts differ in size and thus possess diverse capacities. This heterogeneity allows more specialized experts to handle varying token complexities more effectively. To address the imbalance in expert activation, we propose a novel training objective that encourages the frequent activation of smaller experts, enhancing computational efficiency and parameter utilization. Extensive experiments demonstrate that HMoE achieves lower loss with fewer activated parameters and outperforms conventional homogeneous MoE models on various pre-training evaluation benchmarks. Code will be released upon acceptance.
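
Illustrative sketch (not from the paper): the abstract describes an MoE layer whose experts have different sizes plus an auxiliary objective that steers routing toward smaller experts. The PyTorch snippet below is a minimal, hedged rendering of that idea written only from the abstract; the class name HeterogeneousMoE, the top-k routing, and the parameter-count-weighted penalty are assumptions for illustration, not the authors' released implementation (which is not yet public).

```python
# Minimal sketch of a heterogeneous MoE layer, assuming:
#  - experts are feed-forward blocks with DIFFERENT hidden sizes,
#  - a learned top-k router assigns tokens to experts,
#  - an auxiliary loss weights routing probability by each expert's
#    parameter count, so smaller experts are encouraged (assumed form;
#    the paper's exact training objective may differ).
import torch
import torch.nn as nn
import torch.nn.functional as F


class HeterogeneousMoE(nn.Module):
    def __init__(self, d_model, hidden_sizes, top_k=2, aux_weight=0.01):
        super().__init__()
        # One expert per entry in hidden_sizes; sizes may differ (heterogeneity).
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, h), nn.GELU(), nn.Linear(h, d_model))
            for h in hidden_sizes
        )
        self.router = nn.Linear(d_model, len(hidden_sizes))
        self.top_k = top_k
        self.aux_weight = aux_weight
        # Relative cost of each expert (fraction of total expert parameters),
        # used by the size-aware auxiliary penalty.
        counts = torch.tensor(
            [sum(p.numel() for p in e.parameters()) for e in self.experts],
            dtype=torch.float,
        )
        self.register_buffer("cost", counts / counts.sum())

    def forward(self, x):
        # x: (tokens, d_model)
        logits = self.router(x)                      # (tokens, num_experts)
        probs = F.softmax(logits, dim=-1)
        topk_p, topk_i = probs.topk(self.top_k, dim=-1)
        topk_p = topk_p / topk_p.sum(dim=-1, keepdim=True)  # renormalize gates

        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            mask = (topk_i == e)                     # which tokens chose expert e
            token_idx, slot_idx = mask.nonzero(as_tuple=True)
            if token_idx.numel() == 0:
                continue
            weight = topk_p[token_idx, slot_idx].unsqueeze(-1)
            out[token_idx] += weight * expert(x[token_idx])

        # Size-aware auxiliary loss: routing mass on large experts costs more,
        # nudging the router toward frequent activation of smaller experts.
        aux_loss = self.aux_weight * (probs.mean(dim=0) * self.cost).sum() * len(self.experts)
        return out, aux_loss


if __name__ == "__main__":
    layer = HeterogeneousMoE(d_model=64, hidden_sizes=[64, 128, 256, 512])
    tokens = torch.randn(10, 64)
    y, aux = layer(tokens)
    print(y.shape, aux.item())  # aux would be added to the language-modeling loss
```

In this sketch the auxiliary term is simply added to the language-modeling loss during training; how the paper actually balances load and expert size should be checked against the released code once available.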
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2408.10681
Accession Number: edsarx.2408.10681
Database: arXiv