[MoEs] Mixture of Experts

Author

{Scaling}

Blog post: https://huggingface.co/blog/moe

Models: https://huggingface.co/models?search=mistralai/Mixtral
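
The core idea behind MoE layers (and Mixtral, which uses 8 experts with top-2 routing) is that a gating network selects a small subset of experts per token and mixes their outputs. A minimal NumPy sketch of top-k routing, with toy linear experts and hypothetical names (`moe_layer`, `w_gate` are illustrative, not from any library):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def moe_layer(x, w_gate, experts, k=2):
    """Route each token to its top-k experts and mix their outputs.

    x: (tokens, d_model); w_gate: (d_model, n_experts);
    experts: list of (d_model, d_model) matrices standing in for expert FFNs.
    """
    logits = x @ w_gate                         # (tokens, n_experts) router scores
    topk = np.argsort(logits, axis=-1)[:, -k:]  # indices of the k best experts per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        sel = topk[t]
        weights = softmax(logits[t, sel])       # renormalize over the selected experts only
        for w, e in zip(weights, sel):
            out[t] += w * (x[t] @ experts[e])   # weighted sum of expert outputs
    return out
```

Because only `k` of `n_experts` experts run per token, parameter count scales with `n_experts` while per-token compute scales with `k`; real implementations batch tokens per expert instead of looping.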