A comprehensive Python library for fitting hyperelastic and Mullins effect material models to experimental data.
HyperFit is a configuration-driven library designed for both direct Python usage and C++ integration via pybind11. It provides robust, extensible fitting capabilities for hyperelastic material models commonly used in computational mechanics.
- Configuration-Driven: Complete fitting process controlled by a single configuration dictionary
- Modular Architecture: Easy to extend with new material models and optimization strategies
- Dual-Use API: Clean Python API with C++ bindings for integration
- Multiple Models: Support for Ogden, Polynomial, and Reduced Polynomial models
- Mullins Effect: Optional stress softening for filled elastomers
- Robust Optimization: Multiple optimization algorithms with stability controls
- Quality Metrics: Comprehensive fitting diagnostics and quality assessment
- Ogden Model: W = Σᵢ (2μᵢ/αᵢ²) * (λ₁^αᵢ + λ₂^αᵢ + λ₃^αᵢ - 3)
- Polynomial Model: W = Σᵢⱼ C_ij * (I₁ - 3)^i * (I₂ - 3)^j
- Reduced Polynomial Model: W = Σᵢ C_i0 * (I₁ - 3)^i
- PANN (Physics-Augmented Neural Network): Neural network model with guaranteed polyconvexity and physics constraints
- Mullins Effect: η = 1 - erf((W_max - W) / (m + β * W_max)) / r
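The Mullins expression above is easy to sanity-check numerically. A minimal sketch (the parameter values are illustrative only, not library defaults):

```python
from math import erf

def mullins_eta(W, W_max, r=2.0, m=25.0, beta=0.1):
    # Ogden-Roxburgh softening: eta = 1 - erf((W_max - W) / (m + beta*W_max)) / r
    return 1.0 - erf((W_max - W) / (m + beta * W_max)) / r

# On the primary loading path (W == W_max) there is no softening:
print(mullins_eta(50.0, 50.0))  # 1.0
# On unloading (W < W_max) the stress is scaled by eta < 1:
print(mullins_eta(10.0, 50.0))
```

Note that η returns to 1 whenever the current energy reaches the historical maximum, which is exactly the primary-loading condition.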
```bash
# Install from source
git clone https://github.com/hyperfit/hyperfit.git
cd hyperfit
pip install -e .

# Or install from PyPI (when available)
pip install hyperfit
```

```bash
# Build C++ bindings
cd cpp_bindings
python setup.py build_ext --inplace
# Or use CMake integration (see documentation)
```

Basic Python usage:

```python
import hyperfit
import numpy as np

# Define experimental data
uniaxial_strain = np.array([0.1, 0.2, 0.3, 0.4, 0.5])
uniaxial_stress = np.array([100e3, 180e3, 240e3, 280e3, 300e3])

# Configuration dictionary
config = {
    "model": "reduced_polynomial",
    "model_order": 3,
    "experimental_data": {
        "uniaxial": {
            "strain": uniaxial_strain,
            "stress": uniaxial_stress,
        }
    },
    "fitting_strategy": {
        "initial_guess": {"method": "lls"},
        "optimizer": {"methods": ["L-BFGS-B"]},
        "objective_function": {"type": "relative_error"}
    }
}

# Perform fitting
result = hyperfit.fit(config)

if result['success']:
    print("Fitted parameters:", result['parameters'])
    print("RMS error:", result['diagnostics']['rms_error'])
else:
    print("Fitting failed:", result['error'])
```

Training a PANN model:

```python
import hyperfit
import numpy as np

# Define experimental data
uniaxial_strain = np.array([0.1, 0.2, 0.3, 0.4, 0.5])
uniaxial_stress = np.array([100e3, 180e3, 240e3, 280e3, 300e3])

# PANN configuration
config = {
    "model": "pann",
    "hidden_dims": [32, 32],  # Neural network architecture
    "experimental_data": {
        "uniaxial": {
            "strain": uniaxial_strain,
            "stress": uniaxial_stress,
        }
    },
    "fitting_strategy": {
        "epochs": 100,                    # Training epochs
        "batch_size": 32,                 # Batch size
        "learning_rate": 1e-3,            # Learning rate
        "train_ratio": 0.8,               # Train/validation split
        "update_normalization_freq": 10,  # Frequency to update normalization
        "save_checkpoint_freq": 10,       # Frequency to save checkpoints
    },
    "pann_output_dir": "checkpoints/pann_model"  # Optional: directory for saved checkpoints
}

# Train PANN model
result = hyperfit.fit(config)

if result['success']:
    print("Training completed!")
    print("Final train loss:", result['diagnostics']['final_train_loss'])
    print("Final validation loss:", result['diagnostics']['final_val_loss'])
else:
    print("Training failed:", result['error'])
```

C++ usage:

```cpp
#include "hyperfit_cpp.hpp"
#include <iostream>
#include <vector>

int main() {
    // Experimental data
    std::vector<double> strain = {0.1, 0.2, 0.3, 0.4, 0.5};
    std::vector<double> stress = {100e3, 180e3, 240e3, 280e3, 300e3};

    // Fit model
    auto result = fit_material_with_arrays(
        "reduced_polynomial", 3,  // model and order
        strain, stress,           // uniaxial data
        {}, {},                   // no biaxial data
        {}, {},                   // no planar data
        {}, {},                   // no volumetric data
        "lls",                    // initial guess method
        {"L-BFGS-B"},             // optimizer
        "relative_error"          // objective
    );

    if (is_fit_successful(result)) {
        auto params = extract_parameters(result);
        std::cout << "Fitting successful!" << std::endl;
        // Use fitted parameters...
    } else {
        std::cout << "Error: " << get_error_message(result) << std::endl;
    }
    return 0;
}
```

Model selection:

```python
config = {
    "model": "reduced_polynomial",  # "ogden", "polynomial", "reduced_polynomial"
    "model_order": 3,               # Number of terms/pairs
    "mullins_effect": False,        # Enable Mullins damage (optional)
}
```

PANN model options:

```python
config = {
    "model": "pann",          # PANN model
    "hidden_dims": [32, 32],  # Neural network hidden layer dimensions (list of ints)
    # Optional: load pre-trained model
    "checkpoint_path": "checkpoints/pann_model/best_model.pth",
    # Optional: device for training
    "device": "cpu",          # "cpu" or "cuda"
}
```

Experimental data:

```python
"experimental_data": {
    "uniaxial": {                   # Required: at least one mechanical test
        "strain": strain_array,     # Engineering strain
        "stress": stress_array      # Nominal stress (Pa)
    },
    "biaxial": {                    # Optional
        "strain": strain_array,
        "stress": stress_array
    },
    "planar": {                     # Optional
        "strain": strain_array,
        "stress": stress_array
    },
    "volumetric": {                 # Optional: for compressible materials
        "j": volume_ratio_array,    # Volume ratio J = V/V₀
        "pressure": pressure_array  # Hydrostatic pressure (Pa)
    }
}
```

Fitting strategy:

```python
"fitting_strategy": {
    "initial_guess": {
        "method": "lls",                # "lls" or "heuristic"
        "alpha_guesses": [1, 2, 3]      # For Ogden LLS (optional)
    },
    "optimizer": {
        "methods": ["L-BFGS-B", "TNC"]  # Try multiple methods
    },
    "objective_function": {
        "type": "relative_error",       # "absolute_error", "relative_error", "stress", "eta"
        "weights": {...}                # Optional data weighting
    },
    "stability_control": "post"         # "post", "penalty", "ignore"
}
```

Mullins effect fitting:

```python
config = {
    "model": "reduced_polynomial",
    "model_order": 3,
    "mullins_effect": {
        "r": 2.0,    # Initial guess (optional)
        "m": 25.0,   # Initial guess (optional)
        "beta": 0.1  # Initial guess (optional)
    },
    "experimental_data": {
        # Include loading/unloading cycle data
        "uniaxial": {"strain": [...], "stress": [...]}
    },
    "fitting_strategy": {
        "objective_function": {"type": "stress"}  # Use stress objective for Mullins
    }
}
```

Advanced options:

```python
config = {
    # ... other config ...
    "parameter_bounds": {
        "C_10": (1e-6, 1e6),  # Bounds for specific parameters
        "mu_1": (-1e5, 1e5)
    },
    "convergence": {
        "max_iterations": 1000,
        "tolerance": 1e-8,
        "relative_tolerance": 1e-6
    }
}
```

The PANN model is a neural network-based approach that guarantees polyconvexity and physics constraints:
- Architecture: Input Convex Neural Network (ICNN) with physics augmentation
- Guarantees: Polyconvexity (by construction), zero stress at reference, thermodynamic consistency
- Parameters: Neural network weights (automatically trained)
- Recommended Architecture: `hidden_dims=[32, 32]` for most materials
- Use Cases: Complex material behavior, when traditional models are insufficient, large deformation problems
- Advantages:
- No manual parameter tuning required
- Automatically learns material behavior from data
- Guaranteed physical constraints
- Can capture complex nonlinear relationships
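The polyconvexity guarantee comes from the input-convex construction: pass-through weights between hidden layers are constrained non-negative and the activation is convex and non-decreasing, which makes the scalar output convex in the input. A minimal NumPy sketch of that idea (not HyperFit's actual implementation; all weights here are random illustrations):

```python
import numpy as np

def softplus(x):
    # Convex, non-decreasing activation
    return np.log1p(np.exp(x))

def icnn_energy(x, Wx0, b0, Wz1, Wx1, b1):
    """Two-layer input-convex network: convex in x because the
    pass-through weights Wz1 are non-negative and softplus is
    convex and non-decreasing."""
    z1 = softplus(Wx0 @ x + b0)
    z2 = softplus(np.abs(Wz1) @ z1 + Wx1 @ x + b1)  # abs() enforces non-negative weights
    return float(z2.sum())

rng = np.random.default_rng(0)
Wx0, b0 = rng.normal(size=(8, 3)), rng.normal(size=8)
Wz1, Wx1, b1 = rng.normal(size=(8, 8)), rng.normal(size=(8, 3)), rng.normal(size=8)

# Midpoint convexity check: f((a+b)/2) <= (f(a) + f(b)) / 2
a, b = rng.normal(size=3), rng.normal(size=3)
mid = icnn_energy((a + b) / 2, Wx0, b0, Wz1, Wx1, b1)
avg = 0.5 * (icnn_energy(a, Wx0, b0, Wz1, Wx1, b1) + icnn_energy(b, Wx0, b0, Wz1, Wx1, b1))
print(mid <= avg)  # True
```

The actual PANN additionally augments the network with physics terms (e.g. to enforce zero stress at the reference configuration), which this sketch omits.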
The Reduced Polynomial model (neo-Hookean for N=1, Yeoh for N=3) is ideal for moderate deformations:
- Strain Energy: W = Σᵢ C_i0 * (I₁ - 3)^i + Σᵢ (1/D_i) * (J - 1)^(2i)
- Parameters: C_i0 (deviatoric), D_i (volumetric)
- Recommended Order: N = 2-3 for most materials
- Use Cases: General hyperelastic materials, moderate strains
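For intuition, the deviatoric part of this energy gives a closed-form uniaxial nominal stress under incompressibility: T = 2(λ - λ⁻²) Σᵢ i·C_i0·(I₁ - 3)^(i-1) with I₁ = λ² + 2/λ. A small sketch (the helper and coefficient values are illustrative, not part of the HyperFit API):

```python
def reduced_poly_uniaxial_stress(lam, C):
    """Nominal uniaxial stress for an incompressible reduced polynomial
    model with coefficients C = [C_10, C_20, ...]."""
    I1 = lam**2 + 2.0 / lam
    dW_dI1 = sum(i * Ci0 * (I1 - 3.0)**(i - 1) for i, Ci0 in enumerate(C, start=1))
    return 2.0 * (lam - lam**-2) * dW_dI1

C = [0.3e6, 0.02e6]  # illustrative C_10, C_20 in Pa
print(reduced_poly_uniaxial_stress(1.0, C))      # 0.0 (stress-free reference state)
print(reduced_poly_uniaxial_stress(1.5, C) > 0)  # True
```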
The Ogden model provides excellent flexibility for large deformations:
- Strain Energy: W = Σᵢ (2μᵢ/αᵢ²) * (λ₁^αᵢ + λ₂^αᵢ + λ₃^αᵢ - 3)
- Parameters: μᵢ, αᵢ (material constants)
- Recommended Order: N = 2-3 pairs
- Use Cases: Large deformations, biological tissues, rubber
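Differentiating the Ogden energy above for an incompressible uniaxial stretch (λ₁ = λ, λ₂ = λ₃ = λ^(-1/2)) gives T = Σᵢ (2μᵢ/αᵢ)(λ^(αᵢ-1) - λ^(-αᵢ/2-1)). A small sketch with made-up parameter pairs:

```python
def ogden_uniaxial_stress(lam, mu, alpha):
    """Nominal uniaxial stress for the incompressible Ogden form:
    T = sum_i (2*mu_i/alpha_i) * (lam**(alpha_i - 1) - lam**(-alpha_i/2 - 1))."""
    return sum(
        (2.0 * m / a) * (lam**(a - 1.0) - lam**(-a / 2.0 - 1.0))
        for m, a in zip(mu, alpha)
    )

mu, alpha = [0.4e6, 0.01e6], [1.8, 5.0]  # illustrative (mu_i, alpha_i) pairs
print(ogden_uniaxial_stress(1.0, mu, alpha))  # 0.0 at the reference state
print(ogden_uniaxial_stress(1.5, mu, alpha) > 0)  # True
```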
The full Polynomial model includes both I₁ and I₂ dependence:
- Strain Energy: W = Σᵢⱼ C_ij * (I₁ - 3)^i * (I₂ - 3)^j
- Parameters: C_ij (material constants)
- Recommended Order: N = 2 for stability
- Use Cases: When I₂ dependence is significant
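To see when I₂ dependence matters, compare how the two invariants grow along an incompressible uniaxial stretch (λ₁ = λ, λ₂ = λ₃ = λ^(-1/2)): I₁ grows roughly like λ² while I₂ grows only like 2λ.

```python
def uniaxial_invariants(lam):
    # Incompressible uniaxial stretch: lam1 = lam, lam2 = lam3 = lam**-0.5
    I1 = lam**2 + 2.0 / lam       # I1 = lam^2 + 2/lam
    I2 = 2.0 * lam + lam**-2      # I2 = 2*lam + 1/lam^2
    return I1, I2

for lam in (1.0, 2.0, 4.0):
    print(lam, uniaxial_invariants(lam))
```

Both invariants equal 3 at λ = 1, so any model of the forms above is automatically stress-free at the reference state.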
- Minimum Data: At least one mechanical test (uniaxial, biaxial, or planar)
- Recommended: Multiple test types for better parameter identification
- Data Quality: Ensure monotonic loading, remove noise, check units
- Strain Range: Include sufficient deformation (>10% strain recommended)
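A small pre-flight check along the lines of the guidance above (a hypothetical helper, not part of HyperFit's API):

```python
import numpy as np

def check_uniaxial_data(strain, stress):
    """Basic sanity checks: matching lengths, monotonic loading,
    and a strain range large enough for a meaningful fit."""
    strain, stress = np.asarray(strain, float), np.asarray(stress, float)
    assert strain.shape == stress.shape, "strain/stress length mismatch"
    assert np.all(np.diff(strain) > 0), "loading must be monotonic"
    if strain.max() < 0.10:
        print("warning: max strain < 10%; parameters may be poorly identified")
    return strain, stress

check_uniaxial_data([0.1, 0.2, 0.3, 0.4, 0.5], [100e3, 180e3, 240e3, 280e3, 300e3])
```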
- Start Simple: Try Reduced Polynomial N=2 first
- Add Complexity: Increase order or try Ogden if needed
- Validate: Check physical reasonableness of parameters
- Cross-Validate: Test predictions on independent data
- Initial Guess: LLS generally more robust than heuristic
- Multiple Methods: Try L-BFGS-B, TNC, and trust-constr
- Stability Control: Use "post" processing for Ogden model
- Convergence: Monitor fitting diagnostics and quality metrics
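Taken together, these tips correspond to a `fitting_strategy` block like the following (an illustrative consolidation of the options documented above):

```python
fitting_strategy = {
    "initial_guess": {"method": "lls"},  # LLS: generally more robust than heuristic
    "optimizer": {"methods": ["L-BFGS-B", "TNC", "trust-constr"]},  # try several methods
    "objective_function": {"type": "relative_error"},
    "stability_control": "post",         # recommended post-processing for Ogden
}
print(sorted(fitting_strategy))
```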
- `hyperfit.fit(config)`: Main fitting function
- `hyperfit.HyperFitError`: Base exception class
- `hyperfit.ConfigurationError`: Configuration validation errors
- `fit_material(config)`: Main C++ fitting function
- `fit_material_with_arrays(...)`: Convenience function taking arrays
- `extract_parameters(result)`: Extract fitted parameters
- `is_fit_successful(result)`: Check fitting success
- `get_error_message(result)`: Get error description
HyperFit includes a comprehensive test suite that validates all models, including PANN, against real experimental material data:
- Uniaxial tension data: 23 data points
- Biaxial tension data: 16 data points
- Planar tension data: 13 data points
- Volumetric compression data: 4 data points
The easiest way to run tests is the `run_pipeline.py` script in the root directory:

```bash
# Run all tests (recommended)
python run_pipeline.py

# Run comprehensive material data tests only
python run_pipeline.py --test comprehensive

# Run with verbose output (shows all print statements)
python run_pipeline.py --verbose

# Run specific test method
python run_pipeline.py --test comprehensive::TestComprehensiveMaterialData::test_reduced_polynomial_n4

# Run with coverage report (requires pytest-cov)
python run_pipeline.py --coverage

# Run tests in parallel (requires pytest-xdist)
python run_pipeline.py --parallel

# Get help
python run_pipeline.py --help
```

```bash
# Install development dependencies
pip install -e ".[dev]"

# Run all tests
pytest tests/ -v

# Run with detailed output
pytest tests/ -v -s

# Run specific test file
pytest tests/test_comprehensive_material_data.py -v -s
```

For PANN model training, you can run the test script directly to see the full training output:

```bash
python tests/run_pann_test_direct.py
```

This will:
- Display complete training progress (all epochs)
- Save checkpoints to `checkpoints/pann_direct_test/`
- Show final training results and the checkpoint file list
The comprehensive test suite (`test_comprehensive_material_data.py`) includes:
- Reduced Polynomial N=4: Tests with all four data types
- Ogden Model: Tests with 2 parameter pairs using all data types
- Polynomial Model: Tests with N=2 using all data types
- PANN Model: Neural network training with all data types (requires PyTorch)
- All Traditional Models: Quick tests with uniaxial data only
- Reduced Polynomial: Comparison of different orders (N=2, 3, 4)
The test data represents real experimental measurements from hyperelastic material characterization, including:
- Stress-strain pairs for uniaxial, biaxial, and planar loading
- Pressure-volume ratio pairs for volumetric compression
- Wide strain range coverage (0.02 to 6.64)
We welcome contributions! Please see CONTRIBUTING.md for guidelines.
```bash
git clone https://github.com/hyperfit/hyperfit.git
cd hyperfit
pip install -e ".[dev]"

# Run comprehensive tests
pytest tests/test_comprehensive_material_data.py -v -s
```

To add a new material model:

- Inherit from the `HyperelasticModel` base class
- Implement the required abstract methods
- Add the model to the model registry
- Include tests and documentation
This project is licensed under the MIT License - see LICENSE for details.
If you use HyperFit in your research, please cite:
```bibtex
@software{hyperfit,
  title  = {HyperFit: A Python Library for Hyperelastic Material Model Fitting},
  author = {Xiaotong Wang},
  year   = {2024},
  url    = {https://github.com/hyperfit/hyperfit}
}
```

- Documentation: https://hyperfit.readthedocs.io
- Issues: https://github.com/hyperfit/hyperfit/issues
- Discussions: https://github.com/hyperfit/hyperfit/discussions
This library builds upon established hyperelastic theory and incorporates algorithms validated against commercial finite element software. Special thanks to the computational mechanics community for their foundational work in this field.