X-Git-Url: https://piware.de/gitweb/?p=handwriting-recognition.git;a=blobdiff_plain;f=README.md;h=3af0686d6972af783a60d8ca377d150c54e01cad;hp=760002f714dd303e786e5d55989023d9a15d2ede;hb=1de3cdb5ecba32a8a3b0a02bbf71e883383a689d;hpb=579223dbae47c81cd315f5b575bfd9f6647890f5

diff --git a/README.md b/README.md
index 760002f..3af0686 100644
--- a/README.md
+++ b/README.md
@@ -6,6 +6,7 @@ Basics:
 - [Neuron](https://en.wikipedia.org/wiki/Artificial_neuron)
 - [Perceptron](https://en.wikipedia.org/wiki/Perceptron)
 - [Backpropagation](https://en.wikipedia.org/wiki/Backpropagation)
+ - [Understanding & Creating Neural Networks with Computational Graphs from Scratch](https://www.kdnuggets.com/2019/08/numpy-neural-networks-computational-graphs.html)
 - [3Blue1Brown video series](https://www.youtube.com/playlist?list=PLZHQObOWTQDNU6R1_67000Dx_ZCJB-3pi)
 
 Too high-level for first-time learning, but apparently very abstract and powerful for real-life:
@@ -67,3 +68,20 @@ real 0m37.927s
 user 1m19.103s
 sys 1m10.169s
 ```
+
+ - This is way too slow. I found an [interesting approach](https://www.kdnuggets.com/2019/08/numpy-neural-networks-computational-graphs.html) that harnesses the power of numpy by doing the computations for lots of images in parallel, instead of spending a lot of time in Python on iterating over tens of thousands of examples. Now the accuracy computation takes only negligible time instead of 6 seconds, and each round of training takes less than a second:
+```
+$ time ./train.py
+output vector of first image: [0.50863223 0.50183558 0.50357349 0.50056673 0.50285531 0.5043152
+ 0.51588292 0.49403 0.5030618 0.51006963]
+classification of first image: 6 with confidence 0.5158829224337754; real label 7
+correctly recognized images after initialization: 9.58%
+cost after training round 0: 1.0462266880961681
+[...]
+cost after training round 99: 0.4499245817840479
+correctly recognized images after training: 11.35%
+
+real 1m51.520s
+user 4m23.863s
+sys 2m31.686s
+```
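
As a rough illustration of the batched-numpy idea described in the new hunk above: one matrix product per layer processes every image at once, so the per-image Python loop disappears and the accuracy check reduces to a single `argmax` over the output columns. This is only a sketch under assumed names and shapes (`forward`, `accuracy`, a 30-unit hidden layer, sigmoid activations, random stand-in data); it is not the repository's actual `train.py`.

```
import numpy as np

def sigmoid(z):
    # Elementwise logistic activation.
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, W1, W2):
    # X holds one flattened 28x28 image per column; a single matrix
    # product per layer covers all images in parallel.
    hidden = sigmoid(W1 @ X)       # shape (n_hidden, n_images)
    return sigmoid(W2 @ hidden)    # shape (10, n_images)

def accuracy(X, labels, W1, W2):
    # argmax over the class axis gives one predicted digit per column,
    # so no Python-level loop over the images is needed.
    predictions = forward(X, W1, W2).argmax(axis=0)
    return np.mean(predictions == labels)

# Toy usage with random data standing in for MNIST (assumed sizes):
rng = np.random.default_rng(0)
X = rng.random((784, 1000))
labels = rng.integers(0, 10, size=1000)
W1 = rng.standard_normal((30, 784)) * 0.1
W2 = rng.standard_normal((10, 30)) * 0.1
print(f"accuracy on random data: {accuracy(X, labels, W1, W2):.2%}")
```

The point of this shape layout is that `W1 @ X` and the comparison `predictions == labels` run inside numpy's compiled code over the whole batch, which is why the accuracy computation described above drops from seconds to negligible time.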