This week Google announced that its translation app will add Neural Machine Translation (NMT) to eight of the languages it translates. Perhaps more importantly, it presented findings on “zero-shot” translation, a method in which a single NMT model can be used to translate language pairs it has never encountered.
On Tuesday, November 15, 2016, Google announced it will use NMT to translate English to and from French, German, Spanish, Portuguese, Chinese, Japanese, Korean and Turkish. Google has lauded the development, saying its translation app is “improving more in a single leap than we’ve seen in the last 10 years combined.”
The tech company has invested heavily in artificial intelligence as the machine translation debate continues to heat up in the language industry. NMT is said to dramatically increase translation quality; Google reported a strong uptick in translation accuracy when it first announced NMT.
NMT translates full sentences instead of phrases or chunks of text, surpassing the abilities of Phrase-Based Machine Translation. Eventually, Google hopes to use NMT for all of the languages it translates.
“Zero Shot” Translation
The fact that Google was going to jump on the NMT bandwagon isn’t news. But a claim Google researchers made in a paper released Monday has drawn more interest.
Researchers say Google’s translation model can handle multiple languages using a single NMT “architecture,” allowing for zero-shot translation: the model can translate between language pairs it has never encountered before.
Zero-shot learning refers to computers recognizing or “learning” new concepts without previous knowledge of them.
“…a multilingual NMT model trained with Portuguese>English and English>Spanish examples can generate reasonable translation for Portuguese>Spanish although it has not seen any data for that language pair. We show that the quality of zero shot language pairs can easily be improved with little additional data of the language pair in question,” the paper states.
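The mechanism behind this, as described in the paper, is to prepend an artificial token naming the target language to each source sentence, so a single shared model learns every direction from the same training stream. A minimal sketch of that input formatting (the function name and exact token format here are illustrative, not Google's actual API):

```python
def format_for_multilingual_nmt(source_sentence: str, target_lang: str) -> str:
    """Prepend an artificial target-language token to the source text,
    in the style of Google's multilingual NMT paper, so one shared
    model knows which language to produce. Token format is illustrative."""
    return f"<2{target_lang}> {source_sentence}"

# The same single model receives every pair; only the token differs:
print(format_for_multilingual_nmt("Hello, world", "es"))  # <2es> Hello, world
print(format_for_multilingual_nmt("Olá, mundo", "en"))    # <2en> Olá, mundo
```

Because the target language is just another input feature, a model trained on Portuguese→English and English→Spanish can be asked for `<2es>` with Portuguese input at inference time, which is where the zero-shot behavior comes from.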
According to Google, the research indicates the “first demonstration of true multilingual zero-shot translation.”
The ability to serve multiple language pairs from a single NMT model means Google can drastically cut down on the number of machine translation models it needs to build. Google Translate works in more than 100 languages and would theoretically need thousands of translation models if it weren’t for the single-model system.
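The “thousands of models” figure follows from simple counting: with pairwise systems, every ordered pair of languages needs its own model, which grows quadratically with the number of languages, while a shared multilingual model stays at one. A quick back-of-the-envelope check:

```python
def pairwise_models(num_languages: int) -> int:
    """Number of directional translation models needed if each
    ordered language pair (e.g. English->French and French->English
    separately) gets its own dedicated system."""
    return num_languages * (num_languages - 1)

# For Google Translate's 100+ languages:
print(pairwise_models(100))  # 9900 separate directional models, versus one shared model
```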
According to Google, the method improves the translation quality of “low-resource languages,” too. Low-resource languages are those for which little reference material is available for training translation systems.
A Reason to be Excited
According to a Google Blog post, the language pairs now available with NMT represent more than 35% of the app’s translation queries, and make up the native languages of around one third of the world’s population.
Thanks to the wider breadth of knowledge NMT generates by translating complete sentences instead of fragments, Google Translate is able to retrieve more relevant information and produce smoother translations, the company says.
There’s no question that Google is on the cutting edge when it comes to machine learning. And its findings on zero-shot translation have made waves in the industry, showing how far machine translation has come and piquing interest in what the future may hold.
Notwithstanding the fact that Google Translate is far from perfect, the news this week is another reason to be excited about language technology.
Follow United Language Group for the latest updates on the translation industry.