In an October 4, 2024 paper, researchers from Johns Hopkins University and Microsoft introduced X-ALMA, a large language model (LLM)-based multilingual translation model that delivers “top-tier performance” across 50 languages, regardless of resource availability.
While many multilingual LLMs attempt to support hundreds of languages, they often struggle to maintain quality, especially for mid- and low-resource languages, where “their performance […] falls short of practical application expectations,” the researchers explained. (Slator)