LLM Pipeline
Overview
Coming soon — this page will explain how LokAI orchestrates LLM translation jobs, including the job lifecycle, retry and backoff strategy, bounded concurrency, and how glossaries and style guides are injected into every prompt.
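Until the full page lands, here is a minimal sketch of the retry-with-backoff and bounded-concurrency patterns mentioned above. The real pipeline uses Effect-TS for this; the plain TypeScript below (with hypothetical helper names `withRetry` and `runBounded`) only illustrates the underlying strategy, not LokAI's actual API.

```typescript
// Hypothetical sketch — LokAI's real implementation uses Effect-TS.

// Retry a failing task with exponential backoff (e.g. 500ms, 1s, 2s, ...).
async function withRetry<T>(
  task: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 500,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await task();
    } catch (err) {
      lastError = err;
      // Double the delay on each failed attempt before retrying.
      const delay = baseDelayMs * 2 ** attempt;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
  throw lastError;
}

// Bounded concurrency: run at most `limit` translation jobs at once,
// so a large import cannot flood the LLM provider.
async function runBounded<T>(
  tasks: Array<() => Promise<T>>,
  limit: number,
): Promise<T[]> {
  const results: T[] = new Array(tasks.length);
  let next = 0;
  async function worker(): Promise<void> {
    while (next < tasks.length) {
      const i = next++; // claim the next unstarted task
      results[i] = await tasks[i]();
    }
  }
  const workers = Array.from(
    { length: Math.min(limit, tasks.length) },
    () => worker(),
  );
  await Promise.all(workers);
  return results;
}
```

In Effect-TS terms, the same behavior is expressed declaratively with a `Schedule` (for retry/backoff) and a `concurrency` option, which is what makes cancellation composable.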
Topics covered
- Job creation and the llm_jobs queue
- Provider abstraction (OpenAI, Anthropic)
- Prompt construction — glossary and style guide injection
- Confidence scoring and uncertainty handling
- Retry, backoff, and cancellation (Effect-TS)
- Cascade re-translation when source text changes
- Usage tracking and cost accounting
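As a preview of the prompt-construction topic above, the sketch below shows one plausible way glossary and style-guide injection can work: only glossary terms that actually occur in the source text are injected, keeping prompts small. The function name `buildPrompt` and the data shapes are hypothetical, not LokAI's real interfaces.

```typescript
// Hypothetical shapes — not LokAI's actual types.
interface GlossaryEntry {
  source: string; // term in the source language
  target: string; // mandated translation
}

function buildPrompt(
  sourceText: string,
  targetLang: string,
  glossary: GlossaryEntry[],
  styleGuide: string,
): string {
  // Inject only glossary entries whose source term appears in the text.
  const relevant = glossary.filter((e) => sourceText.includes(e.source));
  const glossaryBlock = relevant
    .map((e) => `- "${e.source}" must be translated as "${e.target}"`)
    .join("\n");

  // Assemble the prompt, skipping empty sections.
  return [
    `Translate the following text into ${targetLang}.`,
    styleGuide ? `Style guide:\n${styleGuide}` : "",
    glossaryBlock ? `Glossary (mandatory terms):\n${glossaryBlock}` : "",
    `Text:\n${sourceText}`,
  ]
    .filter((section) => section.length > 0)
    .join("\n\n");
}
```

Filtering to relevant terms is a common design choice: it keeps token usage (and therefore cost, see the usage-tracking topic) proportional to the text being translated rather than to the glossary size.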