Microsoft’s Tutel optimizes mixture of experts model training
Nov 23, 2021 | Venture Beat


Microsoft this week announced Tutel, a library to support the development of mixture of experts (MoE) models, a particular type of large-scale AI model. Tutel, which is open source and has been integrated into fairseq, Facebook's PyTorch toolkit for sequence modeling, is designed to enable developers across AI disciplines to “execute MoE more easily and efficiently,” Microsoft says.

MoE models are made up of small clusters of “neurons” that are active only under specific circumstances. Lower “layers” of the MoE model extract features, and experts are called upon to evaluate those features. For example, MoEs can be used to create a translation system, with each expert cluster learning to handle a separate part of speech or a special grammatical rule.
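To make the idea concrete, here is a minimal, hypothetical sketch of an MoE layer in plain PyTorch. It is not Tutel's API: the ToyMoELayer class, its dimensions, and the top-1 routing choice are illustrative assumptions, showing only the core pattern of a gating network that activates one expert per token.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyMoELayer(nn.Module):
    """Illustrative mixture-of-experts layer: a gating network routes each
    token to one of several expert feed-forward networks (top-1 routing)."""

    def __init__(self, model_dim: int, hidden_dim: int, num_experts: int):
        super().__init__()
        self.gate = nn.Linear(model_dim, num_experts)  # scores each expert per token
        self.experts = nn.ModuleList([
            nn.Sequential(
                nn.Linear(model_dim, hidden_dim),
                nn.ReLU(),
                nn.Linear(hidden_dim, model_dim),
            )
            for _ in range(num_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, model_dim). Route each token to its single best expert.
        gate_probs = F.softmax(self.gate(x), dim=-1)   # (tokens, num_experts)
        top_prob, top_idx = gate_probs.max(dim=-1)      # (tokens,)
        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            mask = top_idx == i
            if mask.any():
                # Only the tokens routed here activate this expert's parameters.
                out[mask] = top_prob[mask].unsqueeze(-1) * expert(x[mask])
        return out

# Example: 8 experts over a batch of 16 tokens of width 64
layer = ToyMoELayer(model_dim=64, hidden_dim=256, num_experts=8)
tokens = torch.randn(16, 64)
print(layer(tokens).shape)  # torch.Size([16, 64])
```

Because each token activates only one expert, total parameter count can grow with the number of experts while the compute per token stays roughly constant; that sparsity is what libraries like Tutel aim to execute efficiently at scale.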
