On Thu, Jan 7, 2021 at 04:06, Sergei Marchenko <serge_v_m@hotmail.com> wrote:
Please consider using and contributing to Boost.uBlas https://github.com/boostorg/ublas which recently added tensor data types and operations with convenient Einstein notation:
tensor_t C = ...; // previously initialized result tensor
C = C + A(_i,_j,_k)*B(_j,_l,_i,_m) + 5;
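For readers unfamiliar with the Einstein notation above, the expression can be spelled out with plain loops: the repeated indices i and j are contracted (summed over), the free indices k, l, m index the rank-3 result, and the scalar 5 is added to every entry. This is a minimal sketch with hypothetical fixed dimensions, not Boost.uBlas code:

```cpp
#include <cassert>
#include <cstddef>

// Hypothetical extents chosen for illustration; any sizes work.
constexpr std::size_t I = 2, J = 3, K = 4, L = 2, M = 3;

// Loop-level meaning of C = C + A(_i,_j,_k)*B(_j,_l,_i,_m) + 5:
// i and j appear in both operands and are summed away; k, l, m survive.
void contract_add(const double A[I][J][K],
                  const double B[J][L][I][M],
                  double C[K][L][M]) {
    for (std::size_t k = 0; k < K; ++k)
        for (std::size_t l = 0; l < L; ++l)
            for (std::size_t m = 0; m < M; ++m) {
                double sum = 0.0;
                for (std::size_t i = 0; i < I; ++i)
                    for (std::size_t j = 0; j < J; ++j)
                        sum += A[i][j][k] * B[j][l][i][m];
                C[k][l][m] += sum + 5.0; // scalar broadcast to each entry
            }
}
```

With all entries of A and B set to 1 and C zeroed, every result entry is I*J + 5 = 11, which is a quick sanity check for the index bookkeeping.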
Thank you, Cem, for the suggestion! uBlas::opencl definitely looks interesting, since many basic NN layers can be implemented using various element-wise functions, and the hardware support that comes with it is very appealing. The Einstein tensor notation is convenient for multi-dimensional convolution and pooling layers, although I feel that the C++17 requirement for the tensor extension is probably too strong. I will need to experiment with the library a bit more to get a better sense of what it means to implement NN abstractions on top of it.
Sure, just let me know if you need help. The contraction is not optimized; if you need optimized versions, please let me know - we are working on that right now. We are preparing faster implementations of tensor-times-vector and tensor-times-matrix products (e.g. https://github.com/bassoy/ttv).
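The tensor-times-vector (TTV) operation mentioned above contracts one mode of a tensor with a vector. A reference loop version, with hypothetical dimensions and contraction over the middle mode of a rank-3 tensor (the optimized library kernels would replace exactly this kind of loop nest):

```cpp
#include <cassert>
#include <cstddef>

// Hypothetical extents for illustration.
constexpr std::size_t N1 = 2, N2 = 3, N3 = 4;

// Mode-2 tensor-times-vector: C(i,k) = sum_j A(i,j,k) * v(j).
// The contracted mode j disappears; the result is a matrix.
void ttv_mode2(const double A[N1][N2][N3],
               const double v[N2],
               double C[N1][N3]) {
    for (std::size_t i = 0; i < N1; ++i)
        for (std::size_t k = 0; k < N3; ++k) {
            double sum = 0.0;
            for (std::size_t j = 0; j < N2; ++j)
                sum += A[i][j][k] * v[j];
            C[i][k] = sum;
        }
}
```

With A and v filled with ones, each entry of C equals N2 = 3, a simple check that the correct mode was contracted.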
Best regards, Sergei Marchenko.
Best, Cem