Awesome Mixture of Experts: A Curated List of Mixture of Experts (MoE) and Mixture of Multimodal Experts (MoME) Resources
Topics: artificial-intelligence, sparse, moe, load-balancing, multimodal-learning, mixture-of-experts, mome, gating-network, foundation-models, large-language-models, llms, large-language-model, large-vision-language-models, expert-network, llms-reasoning, llms-benchmarking, mixture-of-multimodal-experts, sparse-moe, sparse-mixture-of-experts, sparse-mixture-of-multimodal-experts
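
For readers new to the area, the sketch below illustrates the core idea behind most entries in this list: a gating network scores each token against a pool of expert networks and routes it to only the top-k of them, so a small fraction of the model's parameters is active per input. This is a minimal, self-contained PyTorch illustration of top-k sparse routing, not the implementation of any specific paper; the class name, expert architecture, and hyperparameters are all illustrative choices.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoE(nn.Module):
    """Minimal top-k gated mixture-of-experts layer (illustrative only)."""

    def __init__(self, dim: int, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Gating network: produces one routing score per expert for each token.
        self.gate = nn.Linear(dim, num_experts)
        # Expert networks: small independent feed-forward blocks.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_tokens, dim). Select the top-k experts per token.
        logits = self.gate(x)                           # (num_tokens, num_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)  # both (num_tokens, top_k)
        weights = F.softmax(weights, dim=-1)            # renormalize over chosen experts
        out = torch.zeros_like(x)
        # Sparse dispatch: each expert processes only the tokens routed to it.
        for e, expert in enumerate(self.experts):
            token_rows, slots = (idx == e).nonzero(as_tuple=True)
            if token_rows.numel() == 0:
                continue
            out[token_rows] += weights[token_rows, slots, None] * expert(x[token_rows])
        return out

layer = SparseMoE(dim=64)
tokens = torch.randn(10, 64)
print(layer(tokens).shape)  # torch.Size([10, 64])
```

Production systems covered in this list add further machinery on top of this skeleton, notably auxiliary load-balancing losses to keep tokens spread evenly across experts and capacity limits for efficient batching.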
Last updated: Sep 25, 2024