piware.de Git - handwriting-recognition.git/commit
Add ReLU layer implementation (master)
author    Martin Pitt <martin@piware.de>
          Sun, 30 Aug 2020 08:20:14 +0000 (10:20 +0200)
committer Martin Pitt <martin@piware.de>
          Sun, 30 Aug 2020 09:40:28 +0000 (11:40 +0200)
commit    d986cacf7fc94fb78904f01e11128d666efff804
tree      e801807813c228fd9f54ec5ca3e76c16086dbedc
parent    59f4fd752941f39ddcd36760202a0dc742747106
Add ReLU layer implementation

ReLU requires normalizing the input data to [0, 1]; otherwise the
activations grow wildly out of range. But normalizing the input range
makes Sigmoid perform worse, so don't do this by default.
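
For reference, a minimal sketch of the kind of [0, 1] normalization
meant here, assuming 8-bit grayscale pixel input; the helper name is
illustrative, not taken from nnet.py:

    import numpy as np

    def normalize(pixels):
        # Hypothetical helper: scale raw 0..255 pixel values
        # into the [0, 1] range as float32.
        return pixels.astype(np.float32) / 255.0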

Even with normalization, ReLU still performs slightly worse than
Sigmoid.
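
A minimal sketch of what a ReLU layer with forward and backward passes
can look like; the class shape is an assumption for illustration, not
the actual nnet.py interface:

    import numpy as np

    class ReLULayer:
        # ReLU: forward is max(0, x); the gradient passes through
        # only where the input was positive.

        def forward(self, x):
            self.x = x                      # cache input for backward
            return np.maximum(0.0, x)

        def backward(self, grad_out):
            return grad_out * (self.x > 0)  # zero gradient where x <= 0
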
README.md
nnet.py