Translating Aotearoa



C-3PO, Taste the Translation, and other news

Will Skype Translator – or Star Wars’ C-3PO – replace translators? Barb Darrow of Fortune seems to think so. What do you think?

Ever wondered how to compete with free translation services or C-3PO? A translation company decided to challenge Google Translate with your taste buds. ElaN’s Taste the Translation shows you the value of human expertise. David Griner of Adweek reports.

Measuring quality in translation

As you know, our aim is to provide our clients with quality translations. To that end we have put in place a number of processes in line with the European standard EN 15038 for translation services. While processes are useful and may serve as safeguards, our translations will only be as good as our translators, which is why measuring the quality of your translations is highly important.

Measuring quality in translation objectively is a difficult task. Research soon revealed that there is no consensus in that area and that many different quality assessment systems are used throughout the world. Some were created to grade exams and are highly elaborate – the quality metric developed by the American Translators Association identifies 23 categories of errors and offers 5 levels of severity; others were developed to be used in specific industries, such as the SAE quality metric for the automotive industry. Translation project management systems now come with their own quality assessment features which allow language service providers to keep track of the quality provided by translators.

Since we didn’t have a ready-to-use solution, we decided to create a simple assessment metric based on the Canadian quality assessment model Sical, and we now use it in our recruitment process when we ask existing members of our panel of translators to assess test translations. Our metric identifies 3 different types of errors (translation, language and compliance errors) and 2 levels of severity. It also allows the assessor to grade the overall ‘naturalness’ of the translation. For test assessments, we ask that the assessor mark up and explain all errors.

We have now decided to roll out our quality assessment metric to all translations. Revisers will be asked to fill out a simple form at the end of each job and indicate the number of errors of each type as well as the overall ‘naturalness’ grade. Comments are welcome too. The assessment form along with the revised translation will be sent to the original translator for feedback purposes, and we will enter the information into a purpose-built application which will keep track of the quality of translations, and help us make informed decisions and identify training needs.
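As an illustration only – the names and fields below are hypothetical, not our actual application – the information gathered from the revisers’ forms could be kept in records like this:

```python
from dataclasses import dataclass

# Hypothetical sketch of a per-job quality record; our real
# application may store different fields under different names.
@dataclass
class QualityRecord:
    job_id: str
    translator: str
    translation_errors: int   # meaning-transfer errors found by the reviser
    language_errors: int      # spelling, syntax, register errors
    compliance_errors: int    # deviations from instructions or style guides
    naturalness: int          # overall grade from 1 (unnatural) to 5 (native-sounding)

def average_naturalness(records):
    """Average 'naturalness' grade across a translator's revised jobs."""
    return sum(r.naturalness for r in records) / len(records)

jobs = [
    QualityRecord("J001", "translator_a", 0, 2, 0, 4),
    QualityRecord("J002", "translator_a", 1, 0, 1, 5),
]
print(average_naturalness(jobs))  # 4.5
```

Aggregating records like these over time is what would let us spot patterns – say, recurring compliance errors – and identify training needs.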

This is an exciting development for us, as we will have better control over the quality of our translations. It will also allow us to help you strengthen your skills and support you in your professional development.

Feel free to write us an email if you have any questions regarding our ongoing quality assessment system.


The secret lives of revisers

Do you know exactly what a reviser does or should do? Many of you carry out revision tasks for us, as our translation process includes a revision stage in accordance with the EN 15038 standard for translation services. But do you know exactly what revision entails?

We can start by stating what it isn’t:

  • revision is not a retranslation, i.e. you should not do the translation all over again;
  • revision is not proofreading, i.e. your task is not simply to read the translation and check that it sounds good.

As a reviser, your task is to compare the translation to the original text to make sure that the translator understood the source text correctly and transferred its meaning adequately into the target language. This means that you are responsible for making sure:

  • that everything has been translated – no omissions should be left unmarked;
  • that the terminology has been properly researched and used;
  • that there are no spelling or punctuation errors;
  • that numbers have been transcribed accurately – for example, the number 1,250.30 in English is 1 250,30 in French;
  • that the formatting of the translation reflects that of the original;
  • that the tone and style of the translation match those of the original text and are appropriate for the intended readership.

In doing so, a reviser needs to respect the original translator’s work and style, and accept that the same meaning may be expressed in different ways. For example, ‘I slept through my alarm this morning’ and ‘My alarm clock didn’t wake me up this morning’ express the same idea; both options would be acceptable.

In that regard, our personal stylistic preferences are irrelevant. The role of a reviser is to eliminate errors and by doing this, improve the quality of the translation. If the text submitted to you doesn’t contain any errors, then you shouldn’t make any corrections. You can however make rephrasing suggestions if you think that they would significantly improve the quality of the translation.

From a practical point of view, please use track changes to mark your corrections (if you don’t know how to use track changes, read this page), and use the comment function to make your suggestions (learn how to use comments here).

You may also find it helpful to use the error categories mentioned in the article on assessing translations (translation and language errors), as well as a third type of error: compliance errors. These relate to non-adherence to instructions, style guides, required formats etc.; a blatant disregard of instructions would qualify as a major compliance error, while a slight deviation from the instructions given is a minor one.

Thinking of revision in those terms may help you distance yourself from your personal preferences and focus on errors per se. As mentioned earlier, it shouldn’t prevent you from suggesting important improvements – you only need to be able to distinguish between improvements and corrections.

Finally, if you believe that the quality of a translation is too poor to be revised – but that should only happen on rare occasions, shouldn’t it? – let us know and give some examples. Do not start the translation afresh without being instructed to do so by the project manager.

We are in the process of developing an ongoing quality assessment system, which should be implemented by the end of the year at the latest. This will involve a few changes in the way we do revision, but the basic concept of revision will remain. Compulsory training in that area will also be provided.

The key to translation assessments

Many of you will be familiar with the way we recruit new translators. Anyone who is interested in applying to become one of our panel translators needs to fill out an application form and do a translation test. When possible, we will ask two of our existing panel translators (i.e. you) to assess the translation and help us decide whether that person is a good translator and should be added to our panel.

To a certain extent, the assessment process is very similar to a revision (see the article on revision in this issue). Both tasks require you to compare the target and source texts, and make sure that the meaning has been translated correctly. While a reviser makes corrections and suggestions, an assessor only makes comments to mark up errors and explain why they are errors (if you aren’t sure how to use comments, read this page).

To help you with this task, we recently established a new quality assessment system and defined two types of errors:

  • Translation errors: these are related to the transfer of meaning. They may be omissions, additions, mistranslations etc. – the rendered meaning is different to the original;
  • Language errors: they relate to the language used in the target text, i.e. spelling mistakes, improper syntax, inadequate language level etc.

There are two severity levels: errors may be either minor or major. For example, the colour of a car in a short description in a novel may not be a major piece of information to the reader – if the car is red in the translation when it is burgundy in the original, it won’t be of great consequence, and would normally be considered minor; however, in a theft report to the police, the colour of the car is an essential element and any mistake in that regard would be major. Another example is punctuation. While it may generally be considered a minor issue, in a sentence like ‘Let’s eat Grandma!’, the lack of a comma determines what will be served for dinner…

Your comments should contain an error code, as well as a note explaining why this should be considered as an error (in English).

 Error type    Severity   Code
 Translation   Major      MT
 Translation   Minor      mt
 Language      Major      ML
 Language      Minor      ml

For example:

‘A grey, blue-eyed cat jumped into his lap and started puring.’


  • his – MT: wrong possessive pronoun. The character is female.
  • puring – ML: spelling mistake. This should be written ‘purring’.

Here’s an example of what you should not do:

‘A grey, blue-eyed cat jumped into his lap and started puring.’


  • A grey, blue-eyed cat – The cat was grey and had blue eyes and jumped into her lap.
  • his – her
  • puring – purring

Two general questions round off the assessment process:

  1. What is the intended purpose of the original text? Can the target text be used for that purpose?
  2. On a scale of 1 to 5 (with 1 = doesn’t sound natural at all; 5 = well written and sounds as if it had been written by a native speaker of the target language), how natural does the translation sound?
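To make the coding scheme concrete, here is a small illustrative sketch – the comment format (error text, a separator, then the code and note) is an assumption, not a prescribed layout – showing how the four codes could be tallied from a list of assessor comments:

```python
# Illustrative sketch only: tallying the four error codes (MT, mt, ML, ml)
# from assessor comments written as "<error text> - <CODE>: <explanation>".
# The separator and layout are assumptions for this example.
from collections import Counter

comments = [
    "his - MT: wrong possessive pronoun. The character is female.",
    "puring - ML: spelling mistake. This should be written 'purring'.",
    "also - mt: minor shift in emphasis.",
]

VALID_CODES = {"MT", "mt", "ML", "ml"}

def tally_codes(comments):
    """Count how often each error code appears in the assessor's comments."""
    codes = Counter()
    for comment in comments:
        note = comment.split(" - ", 1)[1]       # drop the quoted error text
        code = note.split(":", 1)[0].strip()    # the code precedes the colon
        if code in VALID_CODES:
            codes[code] += 1
    return codes

print(dict(tally_codes(comments)))  # {'MT': 1, 'ML': 1, 'mt': 1}
```

A tally like this makes it easy to see at a glance whether an applicant’s problems lie with meaning transfer (MT/mt) or with the target language itself (ML/ml), before answering the two general questions above.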

As you can imagine, the assessment process is very important. If we ask you to do one, it means that we trust you to give a fair and informed assessment of a test translation that we can then send back to the applicant, so that they may become aware of their strengths and weaknesses. As with a revision, you should only focus on errors, and accept that different translators have different styles.
