Facebook has introduced the first AI model that helps people communicate by translating between any of 100 languages around the world. Key features of the model:
- Facebook AI is launching M2M-100, the first multilingual machine translation (MMT) model that can translate between any pair of 100 languages without relying on English data. The model is open source.
- Most English-centric multilingual models translate, say, Chinese to French by pivoting through English (training on Chinese-to-English and English-to-French pairs), since English training data is the most widely available. To better preserve meaning, our model trains directly on Chinese-to-French data. On BLEU, the metric commonly used to evaluate machine translation, it outperforms English-centric systems by 10 points.
- M2M-100 is trained on a total of 2,200 language directions, 10x more than the previous best English-centric multilingual models. Deploying M2M-100 will improve translation quality for billions of people, especially speakers of low-resource languages.
- This accomplishment is the culmination of years of foundational work by Facebook AI in machine translation. Today, we are sharing details on how we built a more diverse training data set and a model for MMT across 100 languages.
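To make the BLEU comparison above concrete, here is a rough, simplified sketch of how a sentence-level BLEU score is computed: clipped n-gram precisions combined by a geometric mean, scaled by a brevity penalty. This is an illustrative toy, not the corpus-level tooling (e.g. sacrebleu) used for the reported 10-point gain.

```python
from collections import Counter
import math

def bleu(candidate: str, reference: str, max_n: int = 4) -> float:
    """Toy sentence-level BLEU for illustration only; real evaluations
    use corpus-level implementations with smoothing."""
    cand, ref = candidate.split(), reference.split()
    precisions = []
    for n in range(1, max_n + 1):
        cand_ngrams = Counter(tuple(cand[i:i + n]) for i in range(len(cand) - n + 1))
        ref_ngrams = Counter(tuple(ref[i:i + n]) for i in range(len(ref) - n + 1))
        # clipped overlap: each candidate n-gram counts at most as often as in the reference
        overlap = sum(min(c, ref_ngrams[g]) for g, c in cand_ngrams.items())
        total = max(sum(cand_ngrams.values()), 1)
        precisions.append(overlap / total)
    if min(precisions) == 0:
        return 0.0
    geo_mean = math.exp(sum(math.log(p) for p in precisions) / max_n)
    # brevity penalty discourages overly short translations
    bp = 1.0 if len(cand) > len(ref) else math.exp(1 - len(ref) / max(len(cand), 1))
    return 100 * bp * geo_mean

print(bleu("the cat sat on the mat", "the cat sat on the mat"))  # identical sentences score 100.0
```

A "10-point" difference on this 0-100 scale is a large gap in translation quality.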
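The "language directions" figure can be checked with a quick calculation: among N languages there are N x (N - 1) ordered (source, target) pairs, while an English-centric model only covers the directions into and out of English. A minimal sketch:

```python
def directions(num_languages: int) -> int:
    """Ordered (source, target) pairs with source != target."""
    return num_languages * (num_languages - 1)

# With 100 languages there are 9,900 possible directions.
print(directions(100))

# An English-centric model covers only to/from English: 2 * 99 = 198 directions,
# whereas M2M-100 was trained on 2,200 directions, roughly 10x more.
print(2 * 99)
```

This is why training 2,200 directions directly, rather than pivoting everything through English, is the headline change.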