Translate long-form text files through a local Ollama-powered desktop and web app.
Book Translator provides a two-stage workflow for translating books and large documents: first it generates a draft translation, then it runs a second pass to improve fluency, consistency, and style. The project targets users who want a local-first interface, progress tracking, saved history, and downloadable output without building a custom prompt pipeline around Ollama.
- Two-stage translation refinement
- Local Ollama model execution
- Flask API and browser UI
- Desktop launcher with tray mode
- Translation history, cache, and export
The repository ships a modular Python application centered around the book_translator package. A Flask backend handles uploads, translation jobs, Ollama integration, metrics, and downloads, while the static frontend provides the browser UI. The current main branch includes tests, Docker support, Windows packaging files, and a desktop launcher via run.py.
Long-document translation tends to fail at the workflow level before it fails at the model level: chapters need chunking, retries, continuity, visibility into progress, and some way to recover usable output from imperfect generations. This project closes that gap with an opinionated local app instead of a collection of scripts. The two-stage pipeline is the core product idea: do not stop at the first draft when a second review pass can improve readability and consistency.
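The two-stage idea can be sketched in a few lines. This is a minimal illustration, not the repository's actual implementation (which lives in `book_translator/services/translator.py`); `call_model` here is a stub standing in for an Ollama request, and all function names are hypothetical.

```python
# Sketch of a translate-then-refine pipeline. `call_model` is a stub
# so the example runs without a model; a real version would POST the
# prompt to a local Ollama server. All names are illustrative.

def call_model(prompt: str) -> str:
    # Stub: echoes the prompt uppercased instead of calling Ollama.
    return prompt.upper()

def two_stage_translate(text: str, target: str = "French") -> str:
    # Stage one: produce a draft translation.
    draft = call_model(f"Translate to {target}: {text}")
    # Stage two: re-prompt the model to polish fluency and consistency.
    refined = call_model(f"Improve fluency and consistency: {draft}")
    return refined
```

The point of the second call is that the model sees its own draft as input, which tends to catch awkward phrasing the first pass leaves behind.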
- `run.py` starts the local app, supports tray mode when optional desktop dependencies are installed, and opens the browser UI.
- `book_translator/app.py` builds the Flask app, registers blueprints, configures CORS, and serves the frontend.
- `book_translator/api/routes.py` exposes translation, models, health, cache, download, and log endpoints.
- `book_translator/services/translator.py` implements chunking, stage-one translation, stage-two refinement, and cache-aware execution.
- `book_translator/database` stores jobs and chunk metadata in SQLite.
- `tests` covers API endpoints, config, and core translation behavior.
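Chunking is the piece of the layout above that most shapes translation quality. The snippet below is a generic paragraph-aware chunker offered only as an illustration of the technique; the repository's actual chunker in `book_translator/services/translator.py` may split differently, and `chunk_text` is a hypothetical name.

```python
def chunk_text(text: str, max_chars: int = 2000) -> list[str]:
    """Split text into chunks of at most roughly max_chars, breaking on
    paragraph boundaries so sentences are not cut mid-thought.
    Illustrative sketch only; a single paragraph longer than max_chars
    is kept whole rather than split."""
    chunks: list[str] = []
    current = ""
    for para in text.split("\n\n"):
        # Start a new chunk when adding this paragraph would overflow.
        if current and len(current) + len(para) + 2 > max_chars:
            chunks.append(current)
            current = para
        else:
            current = f"{current}\n\n{para}" if current else para
    if current:
        chunks.append(current)
    return chunks
```

Paragraph-boundary splitting matters because a model translating half a sentence loses the context needed for agreement and pronoun choice.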
- Python 3.10+
- Ollama running locally
- At least one installed Ollama model
- Clone the repository and install dependencies.

git clone https://github.com/KazKozDev/book-translator.git
cd book-translator
pip install -r requirements.txt
- Pull an Ollama model.
ollama pull translategemma:12b
- Start the application.
python run.py
- Open http://localhost:5001 if it does not open automatically.
- Start the app.
- Choose source language, target language, and model.
- Upload a `.txt` file.
- Monitor progress in the UI, then download the translated result.
If the optional desktop dependencies are unavailable, `run.py` falls back to a plain Flask launch and still opens the browser automatically.
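That kind of fallback is typically a guarded import. The sketch below shows the pattern, not `run.py`'s actual code; `pystray` is an assumed optional dependency, and the real launcher may check a different package.

```python
# Sketch of a graceful desktop-dependency fallback, similar in spirit
# to run.py's behavior. `pystray` is an ASSUMED optional dependency;
# the actual launcher may probe a different package or several.

def pick_launch_mode() -> str:
    try:
        import pystray  # noqa: F401  -- optional tray support
        return "tray"
    except ImportError:
        # Desktop extras missing: fall back to a plain Flask launch.
        return "flask"
```

Returning a mode string rather than branching inline keeps the launch logic testable without the optional dependency installed.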
The repository includes PyInstaller specs for Windows packaging:
pyinstaller book_translator_onefile.spec

Run the test suite with pytest -q.

- The codebase is modular and covered by automated tests.
- The app supports local translation workflows, cache management, metrics, and downloadable text output.
- The README in earlier revisions overstated some capabilities; this version reflects the repository as it exists on the current `main` branch.
MIT - see LICENSE
If you like this project, please give it a star ⭐
For questions, feedback, or support, reach out to:

