Releases: modern-fortran/neural-fortran
neural-fortran-0.13.0
What's Changed
- Contributing guide by @milancurcic in #109
- GSoC Optimizers: Example program to fit a quadratic function by @Spnetic-5 in #134
- Update contributors and acknowledgment by @milancurcic in #138
- Add missing update for `conv2d_layer` by @milancurcic in #141
- Example program that shows how to access internal layer parameters by @milancurcic in #140
- add CELU activation function by @pablomazo in #143
- Added RMSProp Optimizer subroutine by @Spnetic-5 in #144
- SGD optimizer stub by @milancurcic in #139
- Connect `flatten`, `conv2d`, and `maxpool2d` layers in backward pass by @milancurcic in #142
- Added Momentum and Nesterov modifications by @Spnetic-5 in #148
New Contributors
- @pablomazo made their first contribution in #143
Full Changelog: v0.12.0...v0.13.0
neural-fortran-0.12.0
What's Changed
- Added Leaky ReLU activation function (1D & 3D) by @Spnetic-5 in #123
- Class-based activation functions by @ggoyman in #126
New Contributors
- @Spnetic-5 made their first contribution in #123
- @ggoyman made their first contribution in #126
Full Changelog: v0.11.0...v0.12.0
neural-fortran-0.11.0
What's Changed
- Add missing edit descriptor by @milancurcic in #113
- Small workaround to make Intel compiler happy by @milancurcic in #117
- Introduce optimizer_base_type in support of different optimizers by @milancurcic in #116
- Insert a flatten layer if a dense layer follows a layer by @milancurcic in #118
- Add linear activation function by @milancurcic in #119
- Update contributors by @milancurcic in #120
Full Changelog: v0.10.0...v0.11.0
neural-fortran-0.10.0
This release introduces `get_num_params`, `get_params`, and `set_params` methods on the `network` and `layer` derived types, which allow you to more easily get and set network parameters from custom Fortran code or other libraries. See the example to learn how it works.
Thanks to Christopher Zapart @jvo203 for this feature contribution.
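A minimal sketch of the new accessors, assuming the `nf` module and the `input`/`dense` constructor names from the library's public API; exact signatures may differ from this release, so consult the bundled example program:

```fortran
program get_set_params_demo
  ! Hedged sketch: module name (nf), constructors (input, dense), and the
  ! get_num_params/get_params/set_params signatures are assumptions based
  ! on these release notes, not a verified copy of the library's example.
  use nf, only: network, input, dense
  implicit none

  type(network) :: net
  real, allocatable :: params(:)

  ! A small dense network: 3 inputs -> 5 hidden -> 1 output
  net = network([input(3), dense(5), dense(1)])

  ! Query the total number of trainable parameters
  print '(a,i0)', 'Number of parameters: ', net % get_num_params()

  ! Extract all weights and biases as one flat array ...
  params = net % get_params()

  ! ... modify them outside the library (here, scale by 0.5) ...
  params = 0.5 * params

  ! ... and write them back into the network.
  call net % set_params(params)
end program get_set_params_demo
```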
What's Changed
- Fix CMake build by @milancurcic in #110
- Get set network parameters by @milancurcic in #111
Full Changelog: v0.9.0...v0.10.0
neural-fortran-0.9.0
This release introduces the backward passes for the `conv2d` and `maxpool2d` layers and enables the training of convolutional networks.
What's Changed
- CNN backward pass by @milancurcic in #99
Full Changelog: v0.8.0...v0.9.0
neural-fortran-0.8.0
This release introduces the `reshape` layer for connecting rank-1 layers to rank-3 layers, including the capability to read Keras's `Reshape` layer from HDF5 files.
What's Changed
- Bump version in fpm.toml by @milancurcic in #96
- Implement the reshape layer by @milancurcic in #97
Full Changelog: v0.7.0...v0.8.0
neural-fortran-0.7.0
What's Changed
- CI by @milancurcic in #87
- Batch inference by @milancurcic in #90
- Rename output -> predict for consistency with Keras by @milancurcic in #92
- Update README by @milancurcic in #93
- Fix accidental TOC reorder from a previous PR by @milancurcic in #94
Full Changelog: v0.6.0...v0.7.0
neural-fortran-0.6.0
What's Changed
- Update fpm instructions in light of the required HDF5 dependency by @milancurcic in #80
- Add CITATION.cff by @milancurcic in #81
- fixed a memory leak when reading the JSON by @jacobwilliams in #83
- Update contributors list by @milancurcic in #84
- Read Conv2D, MaxPooling2D, and Flatten layers from Keras by @milancurcic in #85
New Contributors
- @jacobwilliams made their first contribution in #83
Full Changelog: v0.5.0...v0.6.0
neural-fortran-0.5.0
What's Changed
- Support for loading Keras models by @milancurcic in #79. Many thanks to @scivision for a complete CMake overhaul and for assisting with adding h5fortran as a dependency. This is an experimental and minimally tested feature, supporting only input and dense layers saved in a Keras HDF5 file. See the mnist_from_keras example to see how it works.
- HDF5, h5fortran, and json-fortran are now required dependencies. You must provide HDF5 to the build system yourself; the latter two are handled automatically.
Full Changelog: v0.4.0...v0.5.0
neural-fortran-0.4.0
What's Changed
- Add note about downloading MNIST data without curl by @milancurcic in #67
- Forward pass for the conv2d layer by @milancurcic in #65
- Forward pass for a max-pooling layer by @milancurcic in #66
- Fix layers summary table by @milancurcic in #70
- fix #72 - use dir tree to expose user API by @rouson in #74
- Clean up example and add a note to emphasize the user API by @milancurcic in #76
- Fix CMake build for the new src directory structure by @milancurcic in #77
- Implement a flatten layer by @milancurcic in #75
Full Changelog: v0.3.0...v0.4.0