Improve translation quality with the Linguistic Quality Assurance (LQA) app
The Linguistic Quality Assurance (LQA) app helps you measure translation quality and gain deeper insight into your translation team's performance.
The app is free and supports industry-standard translation quality metrics (or models) such as TAUS DQF-MQM, LISA, and SAE J2450. You can edit the existing models or create your own from scratch.
While reviewing translations, proofreaders can use the app to categorize mistakes and inconsistencies according to the chosen model. For project managers, this data is summarized in reports that help evaluate the efficiency of the translation process.
The LQA app can only be installed on Crowdin Enterprise.
New Features
AI Annotations: Get instant AI-generated annotations by enabling an AI prompt in your LQA model.
Multi-Select Project Creation: Speed up your workflow by selecting multiple Crowdin projects at once when creating LQA projects.
Batch Report Generation: Generate reports for multiple projects simultaneously, saving you time and effort.
Direct Email Link Access: We've added convenient links next to the language list within the app's project information section. These links correspond to important emails (like "LQA review finished," "LQA Arbitration Request," and "Your opinion needed on LQA results"), allowing you to navigate directly from the app.
Flexible Email Notifications: Beyond the project owner, you can now designate additional email addresses to receive "LQA review finished" notifications, ensuring key stakeholders are always informed.
Setting Up the Linguistic Quality Assurance (LQA) App
1. Install the App
Go to the Crowdin Store and search for the Linguistic Quality Assurance (LQA) app.
Click Install.
In the installation dialog, choose who will have access to the app and in which projects it will be available.
Complete the installation process.
After installation, the app will appear in your organization’s Workspace (left-hand panel).
2. Configure AI Prompt (If Using Annotations)
In your organization’s Workspace, go to the AI section and open the Prompts tab.
After installing the LQA app, a default Linguistic Quality Assurance annotations prompt (custom:lqa:app type) is added automatically with the Disabled status.
To use this prompt, click it, review the settings, and make any necessary changes (e.g., provider, model, or prompt text).
Enable the prompt by clicking Enable in the upper-right corner of the prompt editor.
Return to the LQA > Models tab, open the desired model, and enable AI Annotations. Then select the prompt from the list.
Note: AI annotations in the Editor will only become available if a custom:lqa:app prompt is enabled and selected in the model settings.
3. Create or Customize an LQA Model
In your organization’s Workspace, go to LQA > Models.
Choose one of the built-in models (e.g., TAUS DQF-MQM, LISA, or SAE J2450) or click Create Model to build one from scratch.
To customize an existing model, click Clone and adjust its configuration:
Add or remove error categories and subcategories.
Set severity levels (critical, major, minor) and define penalty weights.
Enable AI Annotations (optional) and select the desired prompt.
Require comments when reviewers report an issue (optional).
Enable translator response and arbitration if needed.
Set tolerance – the number of errors allowed per 1,000 words (see the scoring sketch at the end of this step).
Click Save and give your model a clear, descriptive name.
The saved model will become available for selection when creating an LQA project.
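To make penalty weights and tolerance concrete, here is a minimal sketch of how MQM-style penalty scoring is commonly computed. The specific weights, categories, and pass rule below are illustrative assumptions, not the app's exact formula:

```python
# Illustrative MQM-style scoring: penalty points per error, normalized
# per 1,000 words. The severity weights and the tolerance value are
# example assumptions, not the app's defaults.
SEVERITY_WEIGHTS = {"minor": 1, "major": 5, "critical": 10}

def lqa_score(errors, word_count, tolerance_per_1000=10):
    """errors: list of (category, severity) tuples reported by reviewers."""
    penalty_points = sum(SEVERITY_WEIGHTS[severity] for _, severity in errors)
    penalties_per_1000 = penalty_points / word_count * 1000
    return penalties_per_1000, penalties_per_1000 <= tolerance_per_1000

# Example: three minor issues and one major issue in a 1,200-word sample.
rate, passed = lqa_score(
    [("Terminology", "minor"), ("Grammar", "minor"),
     ("Style", "minor"), ("Accuracy", "major")],
    word_count=1200,
)
print(f"{rate:.2f} penalty points per 1,000 words; passed: {passed}")
```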
4. Create an LQA Project
Go to your organization's Workspace > LQA and open the Projects tab.
Click Add project.
In the New LQA project dialog, choose a Crowdin project, select the type (Rating or Model), and pick a quality model (if applicable).
Click Add to create the LQA project.
Once the project is added, it will appear in the Projects tab list. Click the project name to expand its details. Use the toggles in the Status column to enable LQA for the needed target languages.
Assign reviewers for each enabled language via the project’s Members section. Read more about inviting people to a project.
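LQA projects themselves are created in the app's UI, but if you manage many Crowdin projects and want their names and IDs at hand, you can list them with Crowdin's standard REST API. A minimal sketch, assuming the requests library and placeholder credentials:

```python
# List Crowdin Enterprise projects via the standard REST API v2.
# ORG and TOKEN are placeholders for your organization domain and a
# personal access token.
import requests

ORG = "your-org"
TOKEN = "your-token"

resp = requests.get(
    f"https://{ORG}.api.crowdin.com/api/v2/projects",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"limit": 25},
)
resp.raise_for_status()
for item in resp.json()["data"]:
    project = item["data"]
    print(project["id"], project["name"])
```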
5. Review and Annotate Translations (Proofreader Flow)
Proofreaders can annotate translations directly in the Editor using the LQA app panel.
To review and annotate translations:
In the Editor, add the corrected translation directly to the string.
Open the LQA app from the right-side panel.
In the Create Annotation section, select the corrected translation (the version you just added).
Select the translation you want to rate (the original version you’re replacing).
Select the text segment you want to annotate, or click the highlighted (colored) parts of the translation to report an issue. A Quality Issue dialog will appear where you can:
Choose the Mistake Type and Mistake Severity.
Optionally review or adjust the Sub-segment (pre-filled based on your selection).
Add a Comment.
Click OK to save the annotation.
The annotation will then appear in the Existing Annotations section for the selected string.
Repeat the process for each translation that requires review.
Note: When finished with all strings in the language, click Finish Language in the LQA panel to mark it as complete.
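Each saved annotation captures the same fields you fill in the Quality Issue dialog. As a rough illustration of that data shape (the field names below are assumptions, not the app's internal schema):

```python
# Illustrative shape of a single LQA annotation; field names are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Annotation:
    string_id: int              # the reviewed string in the Crowdin project
    mistake_type: str           # category from the LQA model, e.g. "Terminology"
    severity: str               # "minor", "major", or "critical"
    sub_segment: str            # the annotated part of the translation
    comment: Optional[str] = None  # required only if the model demands comments
```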
6. Review Issues (Translator and Arbiter Flow – if configured)
If your LQA model includes a translator response or arbitration, the review continues after the initial proofreading.
Translator Response
Once a proofreader finishes a language, the assigned translator will be notified.
The translator opens the project in the Editor and views reported annotations in the LQA panel.
For each issue, the translator can:
Accept the issue.
Reject the issue and add a comment explaining why they disagree.
All responses are recorded in the annotation history for that string.
Arbitration (if enabled)
If the translator rejects any issues, the system will notify the assigned arbiter.
The arbiter opens the same string in the Editor and reviews the full history, including the original issue, translator response, and any comments.
For each disputed issue, the arbiter makes a final decision:
Confirm the original issue.
Accept the translator’s argument.
The outcome is stored in the final report and affects the overall score.
Note: The translator and arbiter flow only applies if these options are enabled in the selected model. Once an LQA project is created, you cannot switch the model or enable these steps retroactively.
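The lifecycle of a reported issue can be summarized as a small state machine. This sketch mirrors the flow described above; the state and event names are illustrative, not the app's terminology:

```python
# Illustrative lifecycle of a reported issue; names are hypothetical.
TRANSITIONS = {
    "reported": {"translator_accepts": "accepted",
                 "translator_rejects": "disputed"},
    "disputed": {"arbiter_confirms_issue": "confirmed",
                 "arbiter_accepts_argument": "dismissed"},
}

def next_state(state: str, event: str) -> str:
    return TRANSITIONS[state][event]

# A rejected issue goes to arbitration; a confirmed one counts toward the score.
state = next_state("reported", "translator_rejects")  # -> "disputed"
state = next_state(state, "arbiter_confirms_issue")   # -> "confirmed"
print(state)
```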
7. Generate LQA Reports
To review the results of your LQA project, go to the Reports tab in the LQA app. Reports provide detailed information about the issues found during proofreading and help you assess overall translation quality.
You can generate a report for a specific project using the available filters:
Target Language – narrow the report by one or more target languages.
User – filter annotations by the user who created them.
Files – include only selected source files.
Labels – include strings with selected labels.
Report Template – optionally apply a saved template.
Date Range – required fields to define when the issues were created.
Once the filters are set, click Download Report to export the XLSX file or View Report to preview the results online.
To generate reports for multiple projects at once, click Download All Reports.
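Since reports export as XLSX files, you can post-process them with any spreadsheet library. A minimal sketch with openpyxl, assuming the export has a "Severity" column header (verify against a real export before relying on it):

```python
# Tally exported LQA issues per severity. The "Severity" header is an
# assumption about the XLSX layout; adjust it to match the actual export.
from collections import Counter
from openpyxl import load_workbook

wb = load_workbook("lqa_report.xlsx", read_only=True)
rows = wb.active.iter_rows(values_only=True)
header = next(rows)
severity_col = header.index("Severity")

counts = Counter(row[severity_col] for row in rows if row[severity_col])
for severity, count in counts.most_common():
    print(f"{severity}: {count}")
```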
8. (Optional) Create and Use Report Templates
To streamline report generation, you can save filter settings as reusable templates.
Open the Templates tab in the LQA app.
Click Add Template to create a new one from scratch, or download the default template to use as a starting point.
Customize the template by defining filters (e.g., language, user, or date range).
Saved templates will appear in the Report Template drop-down menu in the Reports tab.
Note: Templates are especially useful if you run recurring reports with consistent settings across multiple projects.
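Conceptually, a template is just a saved bundle of the filter values from the Reports tab. Purely as an illustration (not the app's storage format), a recurring weekly template might capture settings like these:

```python
# Hypothetical illustration of the filter values a report template saves.
weekly_spanish_template = {
    "target_languages": ["es-ES", "es-MX"],
    "users": [],                  # empty = all reviewers
    "files": [],                  # empty = all source files
    "labels": ["release-notes"],
    "date_range": "last_7_days",  # matches a weekly reporting cadence
}
print(weekly_spanish_template)
```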