
Orthogonalised Optimisers

Code for orthogonalising gradients to speed up neural network optimisation.
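For intuition, one common way to orthogonalise a layer's gradient is to take its SVD and keep only the orthonormal factors. The helper below is an illustrative sketch of that idea, not necessarily the exact method this package implements:

import torch

def orthogonalise_grad(grad):
    # Illustrative sketch: replace the gradient's singular values with ones,
    # keeping only its directional (orthonormal) components U @ Vh.
    u, _, vh = torch.linalg.svd(grad, full_matrices=False)
    return u @ vh

g = torch.randn(64, 128)        # a 2-D weight gradient, e.g. from a linear layer
g_orth = orthogonalise_grad(g)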

Install package

git clone https://github.com/MarkTuddenham/Orthogonal-Optimisers.git
cd Orthogonal-Optimisers
pip install .

or

pip install git+https://github.com/MarkTuddenham/Orthogonal-Optimisers.git#egg=orth_optim

Usage

At the top of your main Python script, apply the hook:

from orth_optim import hook
hook()

The torch optimisers now have an orth option, e.g.:

torch.optim.SGD(model.parameters(),
                lr=1e-3,
                momentum=0.9,
                orth=True)
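For context, here is a minimal end-to-end sketch; the model, data, and loss below are placeholders, and only the orth=True argument comes from this package:

import torch
from orth_optim import hook

hook()  # patch the torch optimisers before constructing one

model = torch.nn.Linear(10, 2)  # placeholder model
optimiser = torch.optim.SGD(model.parameters(),
                            lr=1e-3,
                            momentum=0.9,
                            orth=True)

x, y = torch.randn(8, 10), torch.randint(0, 2, (8,))
loss = torch.nn.functional.cross_entropy(model(x), y)

optimiser.zero_grad()
loss.backward()
optimiser.step()  # update computed from the orthogonalised gradients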

Custom Optimisers

If you have a custom optimiser, you can apply the orthogonalise decorator:

from orth_optim import orthogonalise

@orthogonalise
class LARS(torch.optim.Optimizer):
	...
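For instance, decorating a bare-bones SGD re-implementation (used purely for illustration; the assumption here is that the decorated class then accepts the same orth flag as the patched built-in optimisers):

import torch
from orth_optim import orthogonalise

@orthogonalise
class PlainSGD(torch.optim.Optimizer):
    # A minimal SGD, only to illustrate the decorator.
    def __init__(self, params, lr=1e-2):
        super().__init__(params, {"lr": lr})

    @torch.no_grad()
    def step(self):
        for group in self.param_groups:
            for p in group["params"]:
                if p.grad is not None:
                    p.add_(p.grad, alpha=-group["lr"])

model = torch.nn.Linear(10, 2)
# Assumed: the decorated optimiser takes orth=True just like torch.optim.SGD above.
opt = PlainSGD(model.parameters(), lr=1e-2, orth=True)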