Facebook Publishes New Neural Machine Translation Algorithm

Facebook’s Artificial Intelligence Research team has published results on a new approach to neural machine translation (NMT). Their algorithm scores higher than previous systems on three established machine translation tasks and runs up to nine times faster than Google’s NMT system.

Facebook’s technique uses convolutional neural networks, an architecture popular in computer vision. These networks process a sentence hierarchically, which lets them capture complex relationships within it. Facebook trains the network to assign meaning to parts of a sentence: spans of two, three, four, or more neighbouring words. By running the sentence through these networks, the system builds up a representation of what every part of the sentence means. A separate neural network then turns this representation of meaning into the target language.
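The hierarchical processing can be sketched with a toy convolutional encoder. This is a minimal illustration, not Facebook’s actual architecture: the sizes, weights, and the `conv_encode` helper are all invented for the example. Each output position mixes a small window of neighbouring word vectors, and stacking layers widens the window, which is what makes the processing hierarchical.

```python
import numpy as np

np.random.seed(0)

# Toy sizes; real systems use far larger dimensions (illustrative only).
seq_len, emb_dim, hidden, kernel_width = 5, 4, 4, 3
x = np.random.randn(seq_len, emb_dim)          # one embedded word per row
W = np.random.randn(kernel_width, emb_dim, hidden) * 0.1

def conv_encode(x, W):
    """One convolutional layer: each output position mixes a window of
    neighbouring word vectors, so it represents a short phrase rather
    than a single word."""
    k = W.shape[0]
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))       # keep the sequence length
    out = np.zeros((x.shape[0], W.shape[2]))
    for i in range(x.shape[0]):
        window = xp[i:i + k]                   # e.g. a 3-word phrase
        out[i] = np.tanh(np.einsum('kd,kdh->h', window, W))
    return out

# Stacking layers is what makes the processing hierarchical: after two
# width-3 layers, each position has "seen" a 5-word neighbourhood.
layer1 = conv_encode(x, W)
layer2 = conv_encode(layer1, W)
```

Each extra layer lets a position summarize a wider span of the sentence, which is how short phrase representations compose into sentence-level meaning.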

The main advantage of the convolutional method is that it can be applied to many parts of a sentence at the same time. Traditional NMT methods read a sentence word by word, carrying along a summary of everything read so far; that sequential dependency caps throughput no matter how fast the hardware is. By removing it, Facebook’s algorithm runs up to nine times faster than the sequential reading methods.
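The contrast can be shown in a few lines. In this sketch (toy sizes, invented weights, a width-1 convolution for brevity), the recurrent encoder’s step `t` cannot start until step `t-1` finishes, while the convolutional encoder computes every position independently in one batched operation:

```python
import numpy as np

np.random.seed(1)
seq_len, d = 6, 4
x = np.random.randn(seq_len, d)                # embedded sentence, one word per row
W_rnn = np.random.randn(d, d) * 0.1
W_conv = np.random.randn(d, d) * 0.1

# Sequential encoder: h[t] depends on h[t-1], so step t cannot begin
# until step t-1 has finished -- the loop is inherently serial.
h = np.zeros(d)
rnn_states = []
for t in range(seq_len):
    h = np.tanh(x[t] @ W_rnn + h)
    rnn_states.append(h)
rnn_states = np.array(rnn_states)

# Convolutional encoder (width 1 here, for brevity): no position reads
# another position's output, so all time steps run as one batched matmul.
conv_states = np.tanh(x @ W_conv)
```

Because `conv_states` has no loop-carried dependency, a GPU can process every position of the sentence in parallel, which is the source of the speedup described above.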

They also introduced a new technique called “multi-hop,” a smarter and more expressive variant of the “attention” mechanism. Instead of reading the whole sentence and then writing out the whole translated sentence, the network chooses which words of the source text to focus on as it translates word by word. Attention mechanisms are the key to the problem that a word can have different translations depending on its context: while translating a word, the network focuses on the relevant parts of the source sentence to determine a good translation. Multi-hop repeats this focusing step several times for each output word.
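A single attention “hop” can be sketched as dot-product attention over the encoder states, with multi-hop repeating it. This is a hedged illustration: the `attend` and `multi_hop_attend` helpers and the query-update rule are assumptions for the example, not Facebook’s exact formulation.

```python
import numpy as np

np.random.seed(2)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def attend(query, enc_states):
    """One attention 'hop': score every source position against the
    decoder's current query, then average the encoder states using
    those scores as weights."""
    scores = enc_states @ query            # (src_len,) relevance scores
    weights = softmax(scores)              # high weight = relevant source word
    context = weights @ enc_states         # weighted summary of the source
    return context, weights

def multi_hop_attend(query, enc_states, hops=3):
    # Multi-hop (sketch): re-attend once per hop, folding what the
    # previous hop retrieved back into the query. This update rule is
    # an assumption, not the published one.
    weights = None
    for _ in range(hops):
        context, weights = attend(query, enc_states)
        query = query + context
    return query, weights

src_len, d = 5, 4
enc_states = np.random.randn(src_len, d)   # encoder output, one row per source word
query = np.random.randn(d)                 # decoder state for the word being produced
refined, weights = multi_hop_attend(query, enc_states)
```

The attention weights form a distribution over source words, so the decoder can resolve an ambiguous word by looking at the context it appears in; each additional hop refines that focus.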

Facebook plans to use their new approach for…

Read the full article at the Original Source.
