In this paper, an alternative to backpropagation is introduced and tested, which results in faster convergence of the cost function.
Method Article
A way to reduce the time consumption effect of for-loops for training neural networks: optimized propagation
https://doi.org/10.21203/rs.3.rs-776504/v1
This work is licensed under a CC BY 4.0 License
Subject: Artificial Neural Networks
The author declares no competing interests.
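The title refers to the runtime cost of Python-level for-loops when training neural networks. As general background only (not the paper's optimized-propagation method, which is not described in this abstract), a minimal sketch contrasting a per-sample for-loop with a vectorized forward pass for one dense layer; all array shapes here are illustrative assumptions:

```python
import time
import numpy as np

# Hypothetical setup: a batch of 10,000 samples through one dense layer.
rng = np.random.default_rng(0)
X = rng.standard_normal((10_000, 100))   # 10,000 samples, 100 features
W = rng.standard_normal((100, 50))       # layer weights
b = rng.standard_normal(50)              # layer bias

# Per-sample Python for-loop: one small matrix-vector product per iteration.
t0 = time.perf_counter()
out_loop = np.empty((X.shape[0], 50))
for i in range(X.shape[0]):
    out_loop[i] = X[i] @ W + b
loop_time = time.perf_counter() - t0

# Vectorized: a single batched matrix multiply over the whole dataset.
t0 = time.perf_counter()
out_vec = X @ W + b
vec_time = time.perf_counter() - t0

# Both paths compute the same activations; only the loop structure differs.
assert np.allclose(out_loop, out_vec)
print(f"loop: {loop_time:.4f}s  vectorized: {vec_time:.4f}s")
```

On typical hardware the vectorized version is substantially faster, since the per-sample loop pays Python interpreter overhead on every iteration while the batched multiply runs in optimized native code.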