Book Translator

Translate long-form text files through a local Ollama-powered desktop and web app.


Book Translator provides a two-stage workflow for translating books and large documents: first it generates a draft translation, then it runs a second pass to improve fluency, consistency, and style. The project targets users who want a local-first interface, progress tracking, saved history, and downloadable output without building a custom prompt pipeline around Ollama.

Highlights

  • Two-stage translation refinement
  • Local Ollama model execution
  • Flask API and browser UI
  • Desktop launcher with tray mode
  • Translation history, cache, and export

Demo

Book Translator screenshot

Overview

The repository ships a modular Python application centered around the book_translator package. A Flask backend handles uploads, translation jobs, Ollama integration, metrics, and downloads, while the static frontend provides the browser UI. The current main branch includes tests, Docker support, Windows packaging files, and a desktop launcher via run.py.

Motivation

Long-document translation tends to fail at the workflow level before it fails at the model level: chapters need chunking, retries, continuity, visibility into progress, and some way to recover usable output from imperfect generations. This project closes that gap with an opinionated local app instead of a collection of scripts. The two-stage pipeline is the core product idea: do not stop at the first draft when a second review pass can improve readability and consistency.
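The pipeline is small enough to sketch. The following is an illustrative implementation of chunked two-stage translation, not the project's actual code: the paragraph-based chunking strategy and the `translate`/`refine` callables are assumptions, left pluggable so any Ollama client can slot in.

```python
from typing import Callable, List

def chunk_text(text: str, max_chars: int = 2000) -> List[str]:
    """Split on paragraph boundaries so each chunk fits a model context.
    (Illustrative strategy; the project's real chunker may differ.)"""
    chunks, current, size = [], [], 0
    for para in text.split("\n\n"):
        if current and size + len(para) > max_chars:
            chunks.append("\n\n".join(current))
            current, size = [], 0
        current.append(para)
        size += len(para) + 2
    if current:
        chunks.append("\n\n".join(current))
    return chunks

def two_stage_translate(
    text: str,
    translate: Callable[[str], str],  # pass 1: draft translation
    refine: Callable[[str], str],     # pass 2: fluency/consistency review
    max_chars: int = 2000,
) -> str:
    """Draft each chunk, then run the second review pass over the draft."""
    return "\n\n".join(refine(translate(c)) for c in chunk_text(text, max_chars))
```

In the real app each callable would wrap an Ollama completion with a stage-specific prompt; injecting them here also makes the pipeline trivial to test without a model.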


Requirements

  • Python 3.10+
  • Ollama running locally
  • At least one installed Ollama model

Quick Start

  1. Clone the repository and install dependencies.

    git clone https://github.com/KazKozDev/book-translator.git
    cd book-translator
    pip install -r requirements.txt
  2. Pull an Ollama model.

    ollama pull translategemma:12b
  3. Start the application.

    python run.py
  4. Open http://localhost:5001 if it does not open automatically.
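Before launching, you can confirm that Ollama is reachable and that a model is installed. This sketch queries Ollama's standard /api/tags endpoint (the default server listens on port 11434); the helper name is ours, not part of the project.

```python
import json
import urllib.error
import urllib.request

def ollama_models(base_url: str = "http://localhost:11434") -> list:
    """Return the names of installed Ollama models, or [] if unreachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=3) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return []
```

If the list comes back empty, either the server is not running (start it with ollama serve) or no model has been pulled yet.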

Usage

Translate a document

  1. Start the app.
  2. Choose source language, target language, and model.
  3. Upload a .txt file.
  4. Monitor progress in the UI, then download the translated result.

Run the web app without tray mode

If the optional desktop dependencies are unavailable, run.py falls back to a plain Flask launch and still opens the browser automatically.
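The fallback boils down to an optional-import check at startup. A minimal sketch, assuming pystray is the optional tray dependency (the project's actual dependency may differ):

```python
def pick_launch_mode() -> str:
    """Return "tray" when the optional desktop dependency imports cleanly,
    otherwise fall back to "web" (plain Flask plus an auto-opened browser tab)."""
    try:
        import pystray  # noqa: F401 -- hypothetical optional dependency
        return "tray"
    except ImportError:
        return "web"
```

run.py would then branch on the mode: start the tray icon with an embedded server, or call app.run() and webbrowser.open() directly.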

Build desktop binaries

The repository includes PyInstaller specs for Windows packaging:

pyinstaller book_translator_onefile.spec

Run tests

pytest -q

Current State

  • The codebase is modular and covered by automated tests.
  • The app supports local translation workflows, cache management, metrics, and downloadable text output.
  • The README in earlier revisions overstated some capabilities; this version reflects the repository as it exists on the current main branch.

License

MIT. See the LICENSE file.

If you like this project, please give it a star ⭐

For questions, feedback, or support, reach out via LinkedIn or email.
