Autograd is a forward- and reverse-mode automatic differentiation engine that also supports optimization. Automatic differentiation is particularly important in machine learning: for example, it lets one implement backpropagation in a neural network without manually computing derivatives.
https://en.wikipedia.org/wiki/Automatic_differentiation
First, compile the code:
$ g++ tensor.cpp -o output
(Optional) Compile with multi-threading support:
$ g++ -fopenmp -O3 tensor.cpp -o output
$ ./output
(Optional) Compile with GPU acceleration through CUDA:
$ mkdir -p build
$ cd build
$ cmake ..
$ make
$ ./tensor_test
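The CUDA build above assumes a CMakeLists.txt at the repository root. A minimal sketch of what such a file might look like (the target name matches `tensor_test` above, but the source file list and CUDA architectures are assumptions, not taken from this repo):

```cmake
cmake_minimum_required(VERSION 3.18)
project(tensor LANGUAGES CXX CUDA)

# Build the test binary from the tensor sources (file names assumed).
add_executable(tensor_test tensor.cpp)

# Target GPU architectures; adjust to match the installed hardware.
set_target_properties(tensor_test PROPERTIES CUDA_ARCHITECTURES "70;80")
```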