Machine Translation


The machine translation offered by Google, DeepL, Microsoft, Amazon and others has improved significantly since the industry-wide shift to neural machine translation around 2016–2017.

Those systems are trained on very large multilingual datasets, drawn mostly from publicly available data of variable quality and spanning many different subject areas, where the same terms and phrases can mean very different things.

As a result, you can get very good translations, but you can also get poor or out-of-context ones.

That variable quality is why most organisations have machine translations post-edited by native-speaking human editors. That is also our approach with our AI-assisted machine translation service.

However, to create additional value for our clients, we have invested in developing and curating several high-quality custom data layers: large datasets of carefully vetted translations in the most widely used world languages, covering the subject areas most relevant to our clients.

We use these custom data layers to customise the neural machine translation output, raising it to a much higher level of quality than the stock engines mentioned above.
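For readers curious what this kind of customisation can look like in practice, the sketch below fine-tunes an off-the-shelf open-source translation model on a small in-domain parallel dataset using the Hugging Face transformers and datasets libraries. This is a minimal illustration of the general technique under stated assumptions, not our production pipeline: the base model name and the example sentence pairs are placeholders, and a real custom data layer would contain far more curated material.

```python
# Minimal sketch: domain customisation of a stock NMT model by fine-tuning it
# on a small curated parallel dataset. Model name and data are illustrative only.
from datasets import Dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

# Example public base model (assumption: any seq2seq MT checkpoint would do).
model_name = "Helsinki-NLP/opus-mt-en-de"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Hypothetical in-domain sentence pairs standing in for a curated data layer.
pairs = [
    {"en": "The patient was discharged after surgery.",
     "de": "Der Patient wurde nach der Operation entlassen."},
    {"en": "Store the vaccine between two and eight degrees Celsius.",
     "de": "Lagern Sie den Impfstoff zwischen zwei und acht Grad Celsius."},
]
dataset = Dataset.from_list(pairs)

def preprocess(batch):
    # Tokenise source sentences and target sentences; targets become labels.
    model_inputs = tokenizer(batch["en"], truncation=True, max_length=128)
    labels = tokenizer(text_target=batch["de"], truncation=True, max_length=128)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = dataset.map(preprocess, batched=True, remove_columns=["en", "de"])

# Training settings kept deliberately small for the sketch.
args = Seq2SeqTrainingArguments(
    output_dir="custom-en-de",
    num_train_epochs=3,
    per_device_train_batch_size=8,
    learning_rate=2e-5,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()  # fine-tuned weights are saved under output_dir
```

In a real setting the curated dataset would hold many thousands of vetted segment pairs per language pair and subject area, and held-out in-domain test sets would be used to confirm that the customised engine outperforms the stock one before it is put in front of linguists.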

This means we can offer better-quality raw MT out of the box and faster, less costly editing for our linguists, which translates into more competitive rates and faster turnaround times for you.