Google Is Using Artificial Intelligence to Make a Huge Change to Its Translate Tool
Teaching machines to truly understand natural language has been one of the biggest challenges facing computer scientists working to advance artificial intelligence. But Google has made real progress in getting computers to look at language as more than just a bag of words, and these advancements are now making their way into its products.
Google Translate, for example, is getting a technical makeover with the introduction of Neural Machine Translation (NMT). Starting today, you’ll notice vast improvements in translations to and from Hindi, Russian and Vietnamese. This follows the first rollout of NMT in Translate last November, when English, French, German, Spanish, Portuguese, Chinese, Japanese, Korean and Turkish all saw the same improvement.
“We have 103 languages overall, and our goal is to get all of them working with neural nets,” a Google spokesperson told the Observer. He added that the rollout of the remaining languages will take place over many months, but exact timing is unknown because Google launches each language only once the neural system can outperform the current one. Sometimes that means several at once, as with today’s introduction of improved Hindi, Russian and Vietnamese.
Google Translate has always been useful, but only up to a point. You could use it to get the gist of something in another language, but anything beyond a simple phrase wouldn’t come out as an accurate translation. With this new approach, translations on Google Search, translate.google.com, Google apps and, eventually, automatic page translations in Chrome will be significantly better and finally reflect natural language.
“Neural translation is a lot better than our previous technology because we translate whole sentences at a time, instead of pieces of a sentence,” Barak Turovsky, product lead at Google Translate, wrote in a blog post announcing the news.
Previously, Google relied on Phrase-Based Machine Translation (PBMT), which breaks an input sentence into words and phrases to be translated independently. NMT, by contrast, takes the entire sentence as its input and translates it as a whole. It uses deep neural networks, which enable a computer to handle situations it hasn’t seen before by learning, over time, from other data. In this case, part of the training data the program learns from comes from the Google Translate Community, where everyday users from all over the world translate sentences from their own languages and even rate translations.
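To see why translating pieces of a sentence independently falls short, here is a toy sketch of the phrase-based idea, with a tiny made-up English-to-Spanish phrase table and greedy longest-match segmentation. This is purely illustrative and not Google’s actual system: real PBMT uses statistical phrase tables, reordering models and language models, none of which appear here.

```python
# Toy illustration of phrase-based translation: split the input into the
# longest phrases found in a (tiny, made-up) phrase table and translate
# each chunk independently. Not Google's implementation.

PHRASE_TABLE = {
    "i like": "me gusta",
    "green tea": "té verde",
    "green": "verde",
    "tea": "té",
}

def phrase_based_translate(sentence, table):
    words = sentence.lower().split()
    out = []
    i = 0
    while i < len(words):
        # Greedily match the longest known phrase starting at position i.
        for j in range(len(words), i, -1):
            chunk = " ".join(words[i:j])
            if chunk in table:
                out.append(table[chunk])
                i = j
                break
        else:
            # Unknown words pass through untranslated.
            out.append(words[i])
            i += 1
    return " ".join(out)

print(phrase_based_translate("i like green tea", PHRASE_TABLE))
# -> "me gusta té verde"
```

Because each chunk is translated in isolation, a word whose meaning depends on the rest of the sentence (say, “bank” in “river bank” versus “bank account”) gets whatever the phrase table happens to hold. A neural model that reads the whole sentence before producing output can, at least in principle, pick the sense that fits the full context.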
While all languages won’t make the switch for many months, the next batch is expected in a few weeks.
By Sage Lazzaro — first published March 6, 2017 on www.observer.com