How Google Translate works
Some facts about Google Translate and its neural network
Google Translate is widely considered the world's #1 machine translator. The service supports 103 languages and processes about 500 million requests every day.
In 2016, Google introduced Neural Machine Translation (GNMT), which uses an artificial neural network to improve translation quality.
Has translation really gotten better with it? Let's find out!
The intricacies of neural translation: how it works
The neural machine translation model processes text on different principles than the older statistical translation method.
Before the advent of neural networks, translation was carried out word by word: the system translated individual words and phrases, applying grammar rules on top. With complex phrases or long sentences, the quality of the translation therefore left much to be desired.
GNMT translates the entire sentence at once, taking the context into account. The system does not memorize hundreds of translation options for individual phrases; it operates on the semantics of the text.
During translation, a sentence is split into vocabulary segments. Special decoders then determine the "weight" of each segment in the text. Next, the most probable values and translations of the segments are calculated. The last step is to join the translated segments together in accordance with the grammar.
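The four steps above (split, weigh, pick the most probable option, join) can be sketched as a toy pipeline. The phrase table and probabilities below are invented for illustration; the real system learns such weights with a neural network rather than looking them up in a fixed table.

```python
# Toy sketch of the segment-based pipeline: split -> weigh -> pick -> join.
# The candidate table and probabilities are invented for illustration.

candidates = {
    "das": [("the", 0.9), ("that", 0.1)],
    "ist": [("is", 0.95), ("exists", 0.05)],
    "gut": [("good", 0.8), ("well", 0.2)],
}

def translate(sentence):
    segments = sentence.lower().split()          # 1. split into segments
    translated = []
    for seg in segments:
        options = candidates[seg]                # 2. candidate translations with "weights"
        best = max(options, key=lambda o: o[1])  # 3. pick the most probable option
        translated.append(best[0])
    return " ".join(translated)                  # 4. join the segments back together

print(translate("Das ist gut"))  # -> "the is good"
```

Note that this toy skips the final grammar step entirely, which is why the output reads like word-by-word translation; that final re-ordering and agreement step is exactly what the neural model adds.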
How the translator algorithm works
To understand how Google Neural Translation works (another example is https://gglot.com/), let's dive a little deeper into the technical details.
Google Neural Machine Translation is based on Bidirectional Recurrent Neural Networks combined with matrix-based probability calculations.
Let's take a closer look at what all this means.
"Recurrent" means that the system computes the meaning of a word or phrase based on the previous values in the sequence. This is what allows it to take context into account and choose correctly among different translation options.
For example, the Russian word "лук" means both "bow" and "onion"; in a phrase like "mahogany bow" the system will pick "bow", not "onion", because of the surrounding words.
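A minimal sketch of this context-dependent word choice, loosely mimicking how a recurrent model lets the preceding word sway the decision. The context score table and the softmax over it are invented for the example, not taken from GNMT.

```python
# Toy sketch of context-dependent disambiguation: the Russian word "лук"
# can mean "bow" or "onion", and the preceding word decides which.
# The context scores below are invented for illustration.
import math

context_scores = {
    "bow":   {"mahogany": 2.0, "fried": -1.0},
    "onion": {"mahogany": -1.0, "fried": 2.0},
}

def choose_sense(prev_word, senses=("bow", "onion")):
    # Softmax over context scores: a tiny stand-in for the model's
    # probability calculation over translation options.
    scores = [context_scores[s].get(prev_word, 0.0) for s in senses]
    exps = [math.exp(x) for x in scores]
    probs = [e / sum(exps) for e in exps]
    return max(zip(senses, probs), key=lambda sp: sp[1])

sense, prob = choose_sense("mahogany")
print(sense)  # -> "bow"
```

With "fried" as the preceding word, the same function picks "onion" instead: the table entry, not the word itself, carries the decision.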
Bidirectionality means that the neural network is split into two streams, an analyzing one and a synthesizing one. Each stream consists of eight layers that operate on vector representations.
The first stream breaks the sentence into semantic elements and analyzes them, while the second calculates the most probable translation option based on the context and attention modules. Further reading: Wikipedia.
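The two streams and the attention modules described above can be sketched in miniature: a bidirectional "analyzing" stream reads the sentence in both directions and concatenates the results, and one "synthesizing" decoder step computes attention weights over those states. The vectors here are random stand-ins, not a trained model, and the dimensions are invented for the example.

```python
# Minimal sketch of the two streams: bidirectional encoder states plus one
# decoder step with attention weights. All vectors are random toy values;
# a real RNN would compute the hidden states recurrently from the words.
import math
import random

random.seed(0)

def rand_vec(n):
    return [random.uniform(-1, 1) for _ in range(n)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def softmax(xs):
    exps = [math.exp(x) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

sentence = ["the", "mahogany", "bow"]
dim = 4

# Analyzing stream: stand-in hidden states for the left-to-right and
# right-to-left passes, concatenated per word (the bidirectional part).
fwd = [rand_vec(dim) for _ in sentence]
bwd = [rand_vec(dim) for _ in sentence]
encoder_states = [f + b for f, b in zip(fwd, reversed(bwd))]

# Synthesizing stream: one decoder step attends over all encoder states.
decoder_state = rand_vec(2 * dim)
attn = softmax([dot(decoder_state, h) for h in encoder_states])

# The attention weights sum to 1 and say how much each source word
# contributes to the next translated word.
print([round(w, 2) for w in attn])
```

The design point the sketch illustrates: because attention looks at every encoder state at once, the decoder is not limited to the last word it saw, which is what lets whole-sentence context inform each translated word.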