- Pre-gated MoE: An Algorithm-System Co-Design for Fast and Scalable Mixture-of-Expert Inference (arXiv:2308.12066, published Aug 23, 2023)
- EvoMoE: An Evolutional Mixture-of-Experts Training Framework via Dense-To-Sparse Gate (arXiv:2112.14397, published Dec 29, 2021)