Add ReLU layer implementation
index d48ebf3bc0fd713a836bbb238992941d5504f7aa..b309254a3a2a67284c24db6f1af08e704bc8ffaf 100644
--- a/README.md
+++ b/README.md
@@ -112,3 +112,12 @@ output vector of first image: [1.11064478e-02 5.59058012e-03 5.40483856e-02 7.93
 real   4m10.904s
 user   11m21.203s
 ```
+
+- Replace the [Sigmoid](https://en.wikipedia.org/wiki/Sigmoid_function) activation function with [ReLU](https://en.wikipedia.org/wiki/Rectifier_%28neural_networks%29). This has some interesting effects: with a learning rate of 1 the training "overshoots", i.e. the cost function actually _increases_ several times during the learning steps, and the overall result is worse. Letting the learning rate fall linearly over the training rounds helps (see the sketch after the output below), but the end result is still worse:
+```
+cost after training round 99: 0.07241763398153217
+correctly recognized images after training: 92.46%
+output vector of first image: [0.         0.         0.         0.         0.         0.
+ 0.         0.89541759 0.         0.        ]
+classification of first image: 7 with confidence 0.8954175907939048; real label 7
+```
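+
+The project's own code is not shown in this hunk; the following is a minimal NumPy sketch of the two ideas above. The names `relu`, `relu_prime`, and `linear_rate` are illustrative, not functions from this repository:
+```
+import numpy as np
+
+def relu(z):
+    # ReLU: element-wise max(0, z)
+    return np.maximum(0.0, z)
+
+def relu_prime(z):
+    # ReLU derivative for backpropagation: 1 where z > 0, else 0
+    return (z > 0).astype(z.dtype)
+
+def linear_rate(initial, round_index, total_rounds):
+    # learning rate that falls linearly from `initial` towards 0
+    # over the course of the training rounds
+    return initial * (1.0 - round_index / total_rounds)
+
+# e.g. with 100 training rounds and an initial rate of 1:
+# round 0 -> 1.0, round 50 -> 0.5, round 99 -> 0.01
+```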