Description
I think neural networks may be too broad a topic for stdlib (there are a number of Fortran projects in this area), but activation functions and their derivatives could be considered.
Activity
jalvesz commented on Aug 10, 2024
That's a good idea, maybe something like this could be a starting point:
stdlib_math_activations.fypp
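For concreteness, a minimal sketch of the style such a module could take, using sigmoid and its derivative as the example. The module and procedure names here are illustrative, not the actual contents of the linked file:

```fortran
! Illustrative sketch only -- not the contents of the linked file.
! Elemental functions so they apply to scalars and arrays alike,
! following the usual stdlib style.
module math_activations_sketch
    use, intrinsic :: iso_fortran_env, only: dp => real64
    implicit none
    private
    public :: sigmoid, sigmoid_grad
contains
    elemental function sigmoid(x) result(y)
        real(dp), intent(in) :: x
        real(dp) :: y
        y = 1.0_dp / (1.0_dp + exp(-x))
    end function sigmoid

    ! Derivative expressed through the function value itself:
    ! sigma'(x) = sigma(x) * (1 - sigma(x))
    elemental function sigmoid_grad(x) result(y)
        real(dp), intent(in) :: x
        real(dp) :: y
        y = sigmoid(x) * (1.0_dp - sigmoid(x))
    end function sigmoid_grad
end module math_activations_sketch
```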
Some of them would be more interesting with fast versions of some of the intrinsic functions. A companion stdlib_math_fast or stdlib_fast_math could be included.
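As an illustration of what such a fast variant might look like (the choice of approximant and its accuracy range are assumptions, not a proposal for stdlib), here is a tanh based on the [3/2] Padé approximant, which trades accuracy for fewer expensive operations:

```fortran
! Illustrative only: a "fast" tanh via the [3/2] Pade approximant
! tanh(x) ~ x*(15 + x**2) / (15 + 6*x**2), reasonable roughly for
! |x| <= 2, with the result clamped to tanh's range [-1, 1].
elemental function fast_tanh(x) result(y)
    use, intrinsic :: iso_fortran_env, only: dp => real64
    real(dp), intent(in) :: x
    real(dp) :: y, x2
    x2 = x*x
    y = x * (15.0_dp + x2) / (15.0_dp + 6.0_dp*x2)
    y = max(-1.0_dp, min(1.0_dp, y))
end function fast_tanh
```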
Beliavsky commented on Mar 7, 2025
Since the GELU function is included in the code of @jalvesz, the approximations to it described at https://www.johndcook.com/blog/2025/03/06/gelu/ could also be considered for addition. Other posts by John D. Cook discussing activation functions are https://www.johndcook.com/blog/2023/08/06/swish-swiss/ and https://www.johndcook.com/blog/2023/08/06/swish-mish-and-serf/.
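For reference, the exact GELU and the widely used tanh approximation (the one from the original GELU paper, also discussed in the linked posts) can be written as follows. This is a sketch for comparison only, not necessarily how the stdlib code implements it:

```fortran
! Sketch for comparison, not necessarily the stdlib implementation.
! Exact GELU via the Fortran 2008 erf intrinsic, plus the common
! tanh approximation
! GELU(x) ~ 0.5*x*(1 + tanh(sqrt(2/pi)*(x + 0.044715*x**3))).
module gelu_sketch
    use, intrinsic :: iso_fortran_env, only: dp => real64
    implicit none
    real(dp), parameter :: sqrt2     = sqrt(2.0_dp)
    real(dp), parameter :: sqrt_2_pi = sqrt(2.0_dp/acos(-1.0_dp))
contains
    ! Exact: GELU(x) = x * Phi(x) = 0.5*x*(1 + erf(x/sqrt(2)))
    elemental function gelu(x) result(y)
        real(dp), intent(in) :: x
        real(dp) :: y
        y = 0.5_dp * x * (1.0_dp + erf(x/sqrt2))
    end function gelu

    ! Tanh-based approximation, cheaper when erf is slow
    elemental function gelu_tanh(x) result(y)
        real(dp), intent(in) :: x
        real(dp) :: y
        y = 0.5_dp * x * (1.0_dp + tanh(sqrt_2_pi*(x + 0.044715_dp*x**3)))
    end function gelu_tanh
end module gelu_sketch
```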
jalvesz commented on Mar 7, 2025
Can his formula be adopted in stdlib without licensing issues?
It seems it should not be a problem, as mathematical formulas (as opposed to code) are in principle not protected by copyright. If someone else could confirm, I can include this formula in the PR.
jalvesz commented on Mar 29, 2025
@Beliavsky would you mind reviewing PR #860?