
Add automatic clipping support to Opacus #797

@ParthS007

Description

🚀 Feature

Add automatic clipping support to Opacus with DPAutomaticClippingOptimizer and DPPerLayerAutomaticClippingOptimizer classes that adaptively scale per-sample gradients without manual threshold tuning.

Motivation

Manual tuning of the clipping threshold is time-consuming. Different layers may need different clipping thresholds, but Opacus currently requires either a single global threshold or manual per-layer configuration, which can result in poor model utility.

Pitch

Implement automatic clipping based on the paper "Automatic Clipping: Differentially Private Deep Learning Made Easier and Stronger":

Formula: clip_factor = max_grad_norm / (per_sample_norms + 0.01), instead of the current min(1.0, max_grad_norm / per_sample_norms).
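A minimal sketch of the two rules, assuming per_sample_norms is a 1-D tensor of per-sample gradient norms (names and the 1e-6 stabilizer are illustrative, not the actual Opacus internals):

import torch

def flat_clip_factors(per_sample_norms: torch.Tensor, max_grad_norm: float) -> torch.Tensor:
    # Current behavior: only samples whose norm exceeds the threshold are scaled down.
    return (max_grad_norm / (per_sample_norms + 1e-6)).clamp(max=1.0)

def automatic_clip_factors(per_sample_norms: torch.Tensor, max_grad_norm: float,
                           gamma: float = 0.01) -> torch.Tensor:
    # Proposed behavior: every sample is rescaled, so max_grad_norm acts as a
    # scale factor rather than a hard cutoff and no longer needs tuning.
    return max_grad_norm / (per_sample_norms + gamma)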

Usage:

privacy_engine = PrivacyEngine()
model, optimizer, dataloader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=dataloader,
    noise_multiplier=1.0,
    max_grad_norm=1.0,
    clipping="automatic",  # new option; use "automatic_per_layer" for per-layer clipping
)
  • Eliminates manual tuning of the clipping threshold
  • Adaptive per-sample clipping improves model utility (see the per-layer sketch below)
  • Maintains the same privacy guarantees
  • Integrates cleanly with the existing PrivacyEngine API
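As a rough, hypothetical sketch of the per-layer variant (not the actual DPPerLayerAutomaticClippingOptimizer implementation), the clipping step could look like:

import torch
from typing import List

def per_layer_automatic_clip(per_sample_grads: List[torch.Tensor],
                             max_grad_norms: List[float],
                             gamma: float = 0.01) -> List[torch.Tensor]:
    # Each tensor in per_sample_grads has shape (batch_size, *param_shape),
    # and max_grad_norms holds one clipping threshold per layer.
    clipped = []
    for grad, c in zip(per_sample_grads, max_grad_norms):
        norms = grad.flatten(start_dim=1).norm(2, dim=1)  # per-sample norm for this layer
        factors = c / (norms + gamma)                     # automatic rule: no clamp at 1.0
        clipped.append(grad * factors.view(-1, *([1] * (grad.dim() - 1))))
    return clipped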

Additional context

I'm happy to submit a PR for this feature if it aligns with the project's goals. I'm implementing automatic clipping as part of my thesis work, and I believe making it available in the main library could benefit the broader Opacus community.
