Code for Orthogonalising gradients to speed up neural network optimisation
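For context, gradient orthogonalisation can be pictured as replacing each layer's gradient matrix with a nearby (semi-)orthogonal matrix before the update. The sketch below is illustrative only (an SVD-based construction, not necessarily the exact routine this package uses):

    import torch

    def orthogonalise_grad(grad: torch.Tensor) -> torch.Tensor:
        # Illustrative sketch: view the gradient as a 2-D matrix G = U S V^T
        # and return U V^T, discarding the singular-value scaling.
        mat = grad.reshape(grad.shape[0], -1)
        u, _, vh = torch.linalg.svd(mat, full_matrices=False)
        return (u @ vh).reshape(grad.shape)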
git clone https://github.com/MarkTuddenham/Orthogonal-Optimisers.git
cd Orthogonal-Optimisers
pip install .
or
pip install git+https://github.com/MarkTuddenham/Orthogonal-Optimisers.git#egg=orth_optim
And then, at the top of your main Python script:
from orth_optim import hook
hook()
Now the torch optimisers have an orth option, e.g.:
torch.optim.SGD(model.parameters(),
                lr=1e-3,
                momentum=0.9,
                orth=True)
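For instance, a minimal end-to-end sketch (the model, data, and hyper-parameters here are placeholders, not recommendations):

    import torch
    from orth_optim import hook

    hook()  # patch the torch optimisers before constructing one

    model = torch.nn.Linear(10, 2)          # placeholder model
    optimiser = torch.optim.SGD(model.parameters(),
                                lr=1e-3,
                                momentum=0.9,
                                orth=True)

    x = torch.randn(32, 10)                 # placeholder data
    y = torch.randint(0, 2, (32,))

    for _ in range(10):
        optimiser.zero_grad()
        loss = torch.nn.functional.cross_entropy(model(x), y)
        loss.backward()
        optimiser.step()                    # update with orthogonalised gradients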
If you have a custom optimiser, you can apply the orthogonalise decorator:
from orth_optim import orthogonalise
@orthogonalise
class LARS(torch.optim.Optimizer):
    ...
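As a rough illustration, the sketch below wraps a made-up, bare-bones optimiser (PlainSGD, not part of the library) in the decorator; how the decorator exposes orthogonalisation on custom classes is an assumption to verify against the library.

    import torch
    from orth_optim import orthogonalise

    @orthogonalise
    class PlainSGD(torch.optim.Optimizer):
        # Illustrative vanilla-SGD optimiser; only here to show the decorator.

        def __init__(self, params, lr=1e-3):
            super().__init__(params, dict(lr=lr))

        @torch.no_grad()
        def step(self, closure=None):
            loss = None
            if closure is not None:
                with torch.enable_grad():
                    loss = closure()
            for group in self.param_groups:
                for p in group["params"]:
                    if p.grad is not None:
                        # Plain descent step; the decorator is assumed to
                        # orthogonalise the gradients around this update.
                        p.add_(p.grad, alpha=-group["lr"])
            return loss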