Structured JSON data for Deutsche Post shipping services
A comprehensive, schema-validated dataset of Deutsche Post pricing, restrictions, zones, features, services, and compliance frameworks for international postal services, with automated quality assurance.
This dataset is perfect for:
- E-commerce platforms - Calculate shipping costs and restrictions
- Logistics software - Integrate Deutsche Post pricing and rules
- Compliance tools - Check shipping restrictions and sanctions
- Research projects - Analyze postal service patterns and policies
- Educational purposes - Learn about international shipping regulations
- 9 JSON files with comprehensive postal data
- 9 JSON schemas ensuring data integrity
- 100+ countries covered in shipping zones
- 5 product types (e.g., letters, merchandise)
- 3 active service types (registered mail, insurance) - 2 services discontinued as of 2025-01-01
- 31 restrictions with 19 compliance frameworks
| File | Description |
|---|---|
| products.json | Shipping products (letters, parcels, packages) |
| services.json | Additional services (registered mail, insurance, etc.) |
| prices.json | Pricing tables by product, zone, and weight (with effective dates) |
| zones.json | Geographic zones and country mappings |
| weight_tiers.json | Weight brackets for pricing |
| dimensions.json | Size limits and specifications |
| features.json | Service features and capabilities |
| restrictions.json | Shipping restrictions, sanctions, compliance frameworks |
| data_links.json | Cross-references between data files |
All data is validated against JSON schemas in the schemas/ directory.
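The kind of checking these schemas perform can be pictured with a minimal stdlib-only stand-in. The record shape and required fields below are illustrative assumptions, not the actual `products.schema.json` (which lives in `schemas/` and is enforced via the `jsonschema` library):

```python
# Minimal stand-in for schema validation: check required keys and
# primitive types. The real schemas in schemas/ are far richer.
def check(instance: dict, required: dict[str, type]) -> list[str]:
    errors = []
    for key, expected in required.items():
        if key not in instance:
            errors.append(f"missing required field: {key}")
        elif not isinstance(instance[key], expected):
            errors.append(f"{key}: expected {expected.__name__}")
    return errors

# Hypothetical product record shaped loosely like an entry in products.json.
product = {"id": "letter_standard", "name": "Standardbrief", "weight_tier": "up_to_20g"}
print(check(product, {"id": str, "name": str, "weight_tier": str}))  # []
print(check({"id": 42}, {"id": str, "name": str}))
```

In the real repository, `make validate` runs the full `jsonschema` validation against every file listed in `mappings.json`.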
- Python 3.11+
- Git
```bash
git clone https://github.com/gruncellka/porto-data.git
cd porto-data
make setup
```

This installs:

- `porto` CLI for validation and metadata commands
- `jsonschema` for validation
- `ruff`, `mypy` for code quality (Ruff handles formatting and linting)
- `pytest`, `pytest-cov` for testing (87% coverage)
- `pre-commit` framework with hooks (installed automatically)
Note: Pre-commit hooks are installed automatically during `make setup` and run on every commit.
```bash
make validate # Validate all JSON files
make quality  # Run all quality checks
```

```
Product (e.g., "letter_standard")
├─ has dimension_ids → dimensions.json
├─ has weight_tier → weight_tiers.json
└─ has prices in zones → prices.json
   ├─ price array with effective_from/effective_to dates
   └─ references zones.json

Service (e.g., "einschreiben_einwurf")
├─ has features → features.json
├─ has coverage (in cents)
└─ applies to products → products.json

Restriction (e.g., "YEMEN_2015")
├─ country_code: "YE"
├─ region_code: null
└─ framework_id → compliance_frameworks
   └─ Legal basis (conflict zones, operational policies)
```
All relationships are one-directional (no circular dependencies).
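The price lookup implied by this tree can be sketched as follows. The record shapes, field names, and values are illustrative assumptions, not the actual contents of `prices.json`:

```python
from datetime import date

# Illustrative price rows; field names mirror the relationship tree above
# but are assumptions about the real file layout.
prices = [
    {"product_id": "letter_standard", "zone": "world", "weight_tier": "up_to_20g",
     "price_cents": 95, "effective_from": "2020-01-01", "effective_to": "2021-12-31"},
    {"product_id": "letter_standard", "zone": "world", "weight_tier": "up_to_20g",
     "price_cents": 110, "effective_from": "2022-01-01", "effective_to": None},
]

def price_on(product_id: str, zone: str, weight_tier: str, day: date):
    """Pick the row whose effective_from/effective_to window covers `day`."""
    for row in prices:
        if (row["product_id"], row["zone"], row["weight_tier"]) != (product_id, zone, weight_tier):
            continue
        start = date.fromisoformat(row["effective_from"])
        end = date.fromisoformat(row["effective_to"]) if row["effective_to"] else date.max
        if start <= day <= end:
            return row["price_cents"]
    return None  # no price defined for this combination on that date

print(price_on("letter_standard", "world", "up_to_20g", date(2023, 6, 1)))  # 110
```

An open-ended `effective_to` of `null` is treated as "still in effect", which is why the 2022 row wins for the 2023 lookup.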
data_links.json describes the relationships between the data files:
- Dependencies - Which files depend on which (e.g., products depend on zones, weight_tiers, dimensions)
- Links - Product-to-zone-to-weight-tier mappings for fast lookups (e.g., which zones and weight tiers each product supports)
- Lookup rules - How to find prices, services, and resolve weights
- Global settings - Available services and price lookup configuration
This metadata is primarily used by SDKs for optimized data access and validation, but is also useful for understanding the data structure.
Validation: The `porto validate --type links` command ensures consistency between data_links.json and the actual data files. It checks:
- ✅ All products, zones, and weight tiers in links exist in their respective files
- ✅ Product zones and weight tiers match between `data_links.json` and `products.json`
- ✅ Prices exist for all zone+weight_tier combinations
- ✅ Available services are valid and have prices
- ✅ Lookup method configuration matches actual file structure
- ✅ Unit values (weight, dimension, price, currency) are consistent across files
- ✅ All data files are covered in dependencies section
- ✅ No circular dependencies between files
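A toy version of the referential checks above might look like this. The dict shapes and IDs are invented for illustration; the real logic lives in `scripts/validators/links.py`:

```python
# Referenced ID sets, as they might be extracted from zones.json and
# weight_tiers.json (illustrative values).
zones = {"eu", "world"}
weight_tiers = {"up_to_20g", "up_to_50g"}

# Hypothetical per-product link entries shaped after data_links.json.
links = {
    "letter_standard": {"zones": ["eu", "world"], "weight_tiers": ["up_to_20g"]},
    "letter_big": {"zones": ["eu", "mars"], "weight_tiers": ["up_to_50g"]},
}

def dangling_references(links: dict, zones: set, weight_tiers: set) -> list[str]:
    """Report every link entry that points at an ID missing from its file."""
    problems = []
    for product_id, link in links.items():
        for z in link["zones"]:
            if z not in zones:
                problems.append(f"{product_id}: unknown zone {z!r}")
        for wt in link["weight_tiers"]:
            if wt not in weight_tiers:
                problems.append(f"{product_id}: unknown weight tier {wt!r}")
    return problems

print(dangling_references(links, zones, weight_tiers))
# → ["letter_big: unknown zone 'mars'"]
```

An empty result list corresponds to a passing `porto validate --type links` run for this class of check.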
Run validation with:
```bash
porto validate --type links           # Quick validation (CI/CD friendly)
porto validate --type links --analyze # Detailed analysis
```

All data files are in the `data/` directory. Open any `.json` file to view the data.
```bash
# 1. Edit JSON files
vim data/products.json

# 2. Validate your changes
make validate

# 3. Format your changes
make format

# 4. Commit
git add .
git commit -m "feat: update products"
# → Pre-commit hooks run automatically and validate everything
# → If metadata.json is regenerated but not staged, the commit is rejected
# → Stage metadata.json and commit again
```

```bash
make validate    # Validate JSON against schemas
make lint-json   # Check JSON syntax only
make format-json # Auto-format JSON files
```

The pre-commit framework automatically runs hooks on every commit:
- ✅ Formats all JSON and Python files (auto-staged)
- ✅ Validates JSON syntax
- ✅ Validates against schemas
- ✅ Runs Python linting and type checking
- ✅ Regenerates metadata.json if data files changed
Important behaviors:
- ✅ Modified files are automatically staged (fixes uncommitted changes bug)
- ❌ If `metadata.json` is regenerated but not staged, the commit is rejected; you must stage `metadata.json` in the same commit
- ❌ If validation fails, the commit is blocked until you fix the errors
Installing hooks:
```bash
make install-hooks # Reinstall pre-commit hooks (usually not needed)
```

```bash
# CLI Commands (porto)
porto validate                        # Validate everything (default)
porto validate --type schema          # Validate JSON against schemas
porto validate --type links           # Validate data_links.json consistency
porto validate --type links --analyze # Detailed links analysis
porto metadata                        # Generate metadata.json

# Make Commands
make validate            # Validate all JSON files
make validate-data-links # Validate data_links.json consistency

# Formatting
make format      # Format JSON and Python
make format-json # Format JSON only
make format-code # Format Python only

# Linting
make lint      # Lint JSON and Python
make lint-json # Lint JSON only
make lint-code # Lint Python only

# Quality
make quality # Run all checks (format, lint, validate, type-check)

# Testing
make test     # Run all tests
make test-cov # Run tests with coverage (80% threshold)

# Metadata
make metadata # Generate metadata.json with checksums

# Help
make help # Show all commands
```

The metadata.json file is automatically generated with checksums of all data and schema files. It's regenerated when:
- Any file in `data/` or `schemas/` changes
- Checksums don't match the current files
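The change-detection idea behind regeneration can be sketched with `hashlib`; the choice of SHA-256 here is an assumption about what `make metadata` actually uses (the real helpers are in `scripts/utils.py`):

```python
import hashlib

def checksum(payload: bytes) -> str:
    """Hex digest of a file's raw bytes; SHA-256 is assumed for illustration."""
    return hashlib.sha256(payload).hexdigest()

# Any byte-level change to a data file produces a different digest,
# so comparing stored vs. current checksums detects stale metadata.
old = checksum(b'{"products": []}')
new = checksum(b'{"products": [{"id": "letter_standard"}]}')
print(old != new)  # True
```

When a stored checksum no longer matches the current file, `metadata.json` must be regenerated and staged along with the change.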
Structure:
- Grouped by entity name (e.g., `products`, `services`), with data and schema files linked together
- Includes canonical schema URLs from `$id` properties
- Each data file includes a `$schema` property pointing to its schema URL
Commit behavior:
- If `metadata.json` is regenerated during a commit, the commit is rejected unless `metadata.json` is staged
- You must stage `metadata.json` in the same commit as your data changes: `git add metadata.json`
- This ensures `metadata.json` stays in sync with the data files in the same commit
Schema-to-data file mappings are defined in mappings.json (source of truth). All data files include a $schema property with the canonical schema URL for validation and editor support.
- Country codes: ISO 3166-1 alpha-2 (`DE`, `US`, `FR`, `YE`)
- Region codes: ISO 3166-2 (`DE-BY`, `US-CA`, `FR-75`)
- Dates: ISO 8601 (`2024-01-15`, `2023-06-01`)
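A minimal sketch of validating these formats; the regular expressions are simplified approximations of what the schemas enforce, not the schemas' actual patterns:

```python
import re
from datetime import date

COUNTRY = re.compile(r"^[A-Z]{2}$")               # ISO 3166-1 alpha-2, e.g. DE
REGION = re.compile(r"^[A-Z]{2}-[A-Z0-9]{1,3}$")  # ISO 3166-2, e.g. DE-BY, FR-75

def is_iso_date(s: str) -> bool:
    """ISO 8601 calendar dates parse cleanly with date.fromisoformat."""
    try:
        date.fromisoformat(s)
        return True
    except ValueError:
        return False

print(bool(COUNTRY.match("DE")), bool(REGION.match("DE-BY")), is_iso_date("2024-01-15"))
# → True True True
print(bool(COUNTRY.match("deu")), is_iso_date("15.01.2024"))
# → False False
```

Note that these regexes only check the shape of a code, not whether it denotes a real country or region; full membership checks are left to the schemas and validators.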
- `EU` - European Union
- `UN` - United Nations
- `DE` - Germany (national)
- `DP` - Deutsche Post (operational)
- ✅ Tracks occupied/disputed territories
- ✅ Links to legal frameworks (EU sanctions, UN resolutions)
- ✅ Supports partial territory restrictions (`effective_partial`)
- ✅ Historical effective dates (`effective_from`, `effective_to`)
All JSON files are validated against schemas to ensure:
- ✅ Required fields are present
- ✅ Data types are correct (string, number, boolean, array)
- ✅ Values match allowed enums
- ✅ ISO codes follow correct patterns
- ✅ Dates are properly formatted
- ✅ Cross-references are valid
```bash
make validate
```

```
porto-data/
├── data/                       # Main data files (JSON)
│   ├── products.json           # Includes $schema property
│   ├── services.json
│   ├── prices.json
│   ├── zones.json
│   ├── weight_tiers.json
│   ├── dimensions.json
│   ├── restrictions.json
│   ├── features.json
│   └── data_links.json
├── schemas/                    # JSON schemas for validation
│   ├── products.schema.json
│   ├── services.schema.json
│   └── ...
├── cli/                        # CLI commands (porto)
│   ├── main.py                 # Entry point
│   └── commands/               # Subcommands (validate, metadata)
├── scripts/                    # Core validation logic
│   ├── validators/             # Validation modules
│   │   ├── schema.py           # JSON schema validation
│   │   └── links.py            # Data links validation
│   ├── data_files.py           # File mappings from mappings.json
│   ├── utils.py                # Checksum utilities
│   └── generate_metadata.py
├── tests/                      # Test suite (115 tests, 87% coverage)
├── resources/                  # Original source files (PPL CSV, etc.)
│   └── ppl/                    # Deutsche Post price list files
├── .pre-commit-config.yaml     # Pre-commit framework configuration
├── Makefile                    # Build automation
├── pyproject.toml              # Python dependencies & config
├── mappings.json               # Schema-to-data mappings (source of truth)
└── metadata.json               # Generated checksums (auto-generated)
```
- 4-space indentation
- Keys are kept in original order (not sorted)
- Arrays are multi-line for readability
- Uses Python's built-in `json.tool`
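These style rules map directly onto `json.dumps` with `indent=4` and the default unsorted keys; this is a sketch of the effect, and the repo's actual formatter invocation may differ in detail:

```python
import json

# Hypothetical record for demonstration.
record = {"id": "letter_standard", "zones": ["eu", "world"]}

# indent=4 gives 4-space indentation; dict insertion order is preserved
# (no sort_keys), and arrays are emitted one element per line.
formatted = json.dumps(record, indent=4, ensure_ascii=False)
print(formatted)
```

The equivalent command-line form is `python -m json.tool --indent 4 <file>` (the `--indent` flag requires Python 3.9+, well below the project's 3.11+ floor).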
- Ruff (formatting, linting, and auto-fixes - line length: 100)
- MyPy (type checking)
- Fork the repository
- Run `make setup` to install dependencies and hooks
- Make your changes
- Run `make quality` to validate
- Commit (pre-commit hooks validate automatically)
- If `metadata.json` was regenerated, stage it (`git add metadata.json`) and commit again
- Submit a pull request
- Add your data to the appropriate JSON file
- Ensure it follows the schema
- Run `make validate` to check
- Run `make format` to auto-format
- Commit your changes
- If `metadata.json` was regenerated, stage it (`git add metadata.json`) and commit again
- Edit the schema file in `schemas/`
- Update the corresponding data in `data/`
- Run `make validate` to verify compatibility
This project is licensed under the MIT License - see the LICENSE file for details.
This is reference data for Deutsche Post services. Always verify current restrictions, pricing, and service availability with Deutsche Post before shipping.
Data accuracy is maintained on a best-effort basis. For official information, visit:
- Deutsche Post: https://www.deutschepost.de
For questions, issues, or contributions:
- 📧 E-Mail: [email protected]
- 📧 Issues: Open a GitHub issue
- 🔧 Contributions: Submit a pull request
- 📖 Documentation: Check this README and inline comments
🔳 gruncellka