Pretraining
Definition
The initial phase of training a machine learning model on large datasets before adapting it to specific tasks.
How It Works
During pretraining, a model is exposed to large volumes of unlabelled text, often multilingual, and learns general language patterns through self-supervised objectives such as next-token prediction or masked-token prediction. The resulting general-purpose model is then adapted to specific tasks, such as machine translation, by fine-tuning on smaller, task-specific datasets.
In production environments, pretrained models are combined with process controls such as human review, terminology alignment, and repeatable quality checks across multilingual content.
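The self-supervised idea can be illustrated with a deliberately tiny sketch: the "model" below only counts which word tends to follow which, but it learns from raw, unlabelled text with no task labels, which is the essence of pretraining. Real systems use neural networks and billions of tokens; the corpus and function names here are made up for illustration.

```python
from collections import defaultdict

# Toy corpus of raw, unlabelled text (hypothetical).
corpus = (
    "the model learns general patterns . "
    "the model is adapted to tasks . "
    "general patterns transfer to new tasks ."
)

def pretrain(text):
    """Self-supervised 'pretraining': count next-word statistics
    from unlabelled text. No task-specific labels are needed."""
    counts = defaultdict(lambda: defaultdict(int))
    tokens = text.split()
    for prev, nxt in zip(tokens, tokens[1:]):
        counts[prev][nxt] += 1
    return counts

def predict_next(counts, word):
    """Predict the most frequent next word seen during pretraining."""
    followers = counts.get(word)
    if not followers:
        return None
    return max(followers, key=followers.get)

model = pretrain(corpus)
print(predict_next(model, "the"))  # "model" follows "the" most often here
```

The point is not the bigram model itself but the training signal: the data supervises itself, so no human annotation is required at this stage.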
Key Concepts
- core principle: learn general language representations first, specialise later
- workflow-level implementation: pretrain once, fine-tune per task or domain
- terminology and quality consistency across adapted models
- human validation before publication
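The pretrain-once, adapt-per-task split above can be sketched as follows: general statistics are learned from a large corpus, and a much smaller task-specific corpus then shifts them toward the target domain. Both corpora below are hypothetical.

```python
from collections import Counter

# Large general corpus (pretraining data) vs. small domain corpus
# (adaptation / fine-tuning data) -- both hypothetical.
general_corpus = "invoice report summary report summary report"
domain_corpus = "invoice invoice invoice"

pretrained = Counter(general_corpus.split())            # pretraining phase
adapted = pretrained + Counter(domain_corpus.split())   # adaptation phase

# After adaptation, the domain term dominates even though the
# general corpus contributed more tokens overall.
print(pretrained.most_common(1))  # [('report', 3)]
print(adapted.most_common(1))     # [('invoice', 4)]
```

This mirrors the workflow-level pattern: the expensive general phase is done once, while cheap task-specific adaptation is repeated per domain.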
Where It Is Used
- localisation workflows
- AI translation pipelines
- multilingual content production
- see also: Parallel Corpus