FOR TRANSLATORS: It is essential that translators check their translation performance in Oneforma to learn from their errors and let us know if they disagree with any of the feedback they have received. From the “My Tasks” page in Oneforma, go to the webapp you want to check the feedback for and click Actions > View performance. The Performance page opens. In the upper part, called My Performance, you can find:
• Quality score: the overall quality score you got for all the work in that webapp 
• Translations Reviewed: the number of your translations that have been reviewed in 
that specific webapp 
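The two summary figures above map onto a very small data shape. The sketch below is purely illustrative, with hypothetical field names; it is not Oneforma’s actual data model.

```typescript
// Illustrative only: a hypothetical shape for the "My Performance" summary.
// Field names are assumptions, not Oneforma's real schema.
interface PerformanceSummary {
  qualityScore: number;         // overall quality score for all work in the webapp
  translationsReviewed: number; // how many of your translations have been reviewed
}
```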
In the lower part, called Translation feedback, you can find a table listing all the strings where reviewers have found mistakes and/or made edits. You can use this table to compare your original translation with the one submitted by the reviewer. You can also see the error category that has been assigned to each edit.
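For reference, one row of that table can be thought of as a record like the following. The field names are assumptions chosen to mirror the description above; they are not the real schema.

```typescript
// Illustrative only: a hypothetical shape for one row of the Translation feedback table.
interface TranslationFeedbackRow {
  sourceString: string;            // the string that was reviewed
  yourTranslation: string;         // your original translation
  reviewerTranslation: string;     // the edit submitted by the reviewer
  errorCategory: string;           // the error category assigned to the edit
  reviewedAt: Date;                // when the review was made (relevant to the 14-day window)
  decision?: "Agree" | "Disagree"; // your response, once given
}
```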
- If you agree with the reviewer, learn from the feedback and click the Agree button.
- If you disagree with the correction made by the reviewer, click Disagree. Please remember that you have 14 days to disagree with an edit or error category (see the sketch after this list). Once these 14 days have passed, you will no longer be able to click Disagree on that string or provide counter-feedback, so please remember to check this section regularly.
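A minimal sketch of the 14-day disagreement window described above, assuming a simple date comparison; Oneforma enforces this rule on its own side, so this is only an illustration of the principle.

```typescript
// Illustrative only: checks whether the 14-day window to click Disagree is still open.
const DISAGREEMENT_WINDOW_DAYS = 14;

function canStillDisagree(reviewedAt: Date, now: Date = new Date()): boolean {
  const msPerDay = 24 * 60 * 60 * 1000;
  const daysElapsed = (now.getTime() - reviewedAt.getTime()) / msPerDay;
  return daysElapsed <= DISAGREEMENT_WINDOW_DAYS;
}
```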
If you disagree, a text box will open where you can send us a comment and explain why you 
disagree with the reviewer. Click Send counterfeedback to share the comment with the 
Centific team. This will be arbitrated by a third party who will decide if the error should be 
removed from your report or not. NOTE: make sure your comments in the counter-feedback 
box are professional and polite, precise and concise. 
NOTE: “Preferential Edit” DOES NOT penalize the QA score. It has no impact on your QA 
result. It is just an alternative translation for your reference. If both translations are equally 
correct, click Agree. If there is a mistake in the review, click Disagree and let us know what 
the error in the review is, so we can penalize the reviewer if the arbitrator agrees.  
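As a rough illustration of the note above, any scoring logic simply skips edits in the “Preferential Edit” category when counting penalties. The sketch below assumes a plain penalty count with hypothetical names; it is not Oneforma’s actual scoring formula.

```typescript
// Illustrative only: "Preferential Edit" carries no penalty on the QA score.
type FeedbackItem = { errorCategory: string };

function countPenalizedErrors(items: FeedbackItem[]): number {
  return items.filter((item) => item.errorCategory !== "Preferential Edit").length;
}
```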
If the arbitrator agrees with the reviewer and the error is confirmed, you will see a comment with the final arbitration under the “comment from arbitrator” column. If, on the contrary, the arbitrator agrees with you and decides to remove the error from your report, the string will disappear from the list. Your score is updated automatically every time an arbitration is completed.
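In other words, each arbitration either confirms the error (the arbitrator’s comment stays on the row) or overturns it (the row disappears and the score is recalculated). A hedged sketch of that logic, with assumed type and function names throughout:

```typescript
// Illustrative only: how an arbitration outcome could affect the feedback list.
// Names are assumptions, not Oneforma's implementation.
interface ArbitratedRow {
  stringId: string;
  errorConfirmed: boolean;        // arbitrator sided with the reviewer
  commentFromArbitrator?: string; // shown when the error is confirmed
}

function applyArbitration(rows: ArbitratedRow[], result: ArbitratedRow): ArbitratedRow[] {
  if (result.errorConfirmed) {
    // The error stays in the report, now carrying the arbitrator's final comment.
    return rows.map((row) => (row.stringId === result.stringId ? result : row));
  }
  // The error is removed from the report; the score is then recalculated automatically.
  return rows.filter((row) => row.stringId !== result.stringId);
}
```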
FOR REVIEWERS: It is essential that reviewers check their review performance in Oneforma to learn from their errors and see in which cases they have unfairly penalized a translator or performed the review incorrectly. This is extremely important because reviewers are the last step before delivery to the customer, and we need to make sure only qualified reviewers are working in the QA webapps. The scores applied to reviewers come from the strings that have been arbitrated.
From the “My Tasks” page in Oneforma, go to the webapp you want to check the QA 
feedback for and click QA Actions > QA View performance. The Performance page opens. 
In the upper part, called My Performance, you can find:
• Quality score: the overall quality score you got for all the work in that webapp 
• Translations Reviewed: the number of your translations that have been reviewed in 
that specific webapp 
In the lower part, called Translation feedback, you can find a table listing all the strings you reviewed where arbitrators have found mistakes and/or where you misjudged the translation and assigned the wrong error category to the translator.
• “Error category from you to translator” is the error category you originally 
assigned to the translator and for which the translator complained. 
• “Error category from arbitrator to translator” is the error category the arbitrator 
decided to assign to the translator after the complaint about your review. If this 
doesn’t match what appears in the previous column, it means the arbitrator thought 
your review was not fair or the category you used was not correct. 
• “Error category from arbitrator to you” is the error category the arbitrator assigned to you after evaluating the counter-feedback. This can be because you made a mistake in the reviewed string or because you unfairly penalized the translator in the first place.
• “Counterfeedback” contains the original complaint we received from the translator. 
• “Arbitration” is the final decision made by the arbitrator.
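The five columns above map onto a simple record. The sketch below uses hypothetical field names purely to summarize who assigned what to whom; it is not Oneforma’s data model.

```typescript
// Illustrative only: a hypothetical shape for one row of the reviewer's feedback table.
// Field names mirror the columns described above; they are assumptions, not the real schema.
interface ReviewerFeedbackRow {
  errorCategoryFromYouToTranslator: string;        // what you originally assigned
  errorCategoryFromArbitratorToTranslator: string; // what the arbitrator kept or changed it to
  errorCategoryFromArbitratorToYou?: string;       // assigned to you if your review was at fault
  counterfeedback: string;                         // the translator's original complaint
  arbitration: string;                             // the arbitrator's final decision
}
```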
BACK TO ⇒ MACHINE TRANSLATION POSTEDITING (MTPE) TRANSLATION v.3.1