X-Git-Url: https://piware.de/gitweb/?p=handwriting-recognition.git;a=blobdiff_plain;f=README.md;h=96974c990119387ee722e8f5d809536d0885233c;hp=ad1494553d16fddaf4c96faa725c0d27c196c459;hb=0ea12b213873b4bef12e1f2b65eed64704ee040f;hpb=729ae7ea896340b69a4021e0201b9d1c8d29ee89

diff --git a/README.md b/README.md
index ad14945..96974c9 100644
--- a/README.md
+++ b/README.md
@@ -5,6 +5,7 @@ Basics:
  - [MNIST database of handwritten digits](http://yann.lecun.com/exdb/mnist/)
  - [Neuron](https://en.wikipedia.org/wiki/Artificial_neuron)
  - [Perceptron](https://en.wikipedia.org/wiki/Perceptron)
+ - [Backpropagation](https://en.wikipedia.org/wiki/Backpropagation)
  - [3Blue1Brown video series](https://www.youtube.com/playlist?list=PLZHQObOWTQDNU6R1_67000Dx_ZCJB-3pi)
 
 Too high-level for first-time learning, but apparently very abstract and powerful for real-life:
@@ -27,9 +28,24 @@ plt.imshow(grad, cmap='gray')
 plt.show()
 
 plt.imshow(np.sin(np.linspace(0,10000,10000)).reshape(100,100) ** 2, cmap='gray')
-# does not work with QT_QPA_PLATFORM=wayland
+# non-blocking does not work with QT_QPA_PLATFORM=wayland
 plt.show(block=False)
 plt.close()
 ```
 
  - Get the handwritten digits training data with `./download-mnist.sh`
+
+ - Read the MNIST database into numpy arrays with `./read_display_mnist.py`. Plot the first ten images and show their labels, to make sure the data makes sense:
+
+   ![visualize training data](screenshots/mnist-visualize-training-data.png)
+
+ - Define the structure of the neural network: two hidden layers with parametrizable sizes. Initialize weights and biases randomly. This gives totally random classifications of course, but at least makes sure that the data structures and computations work:
+
+```
+$ ./train.py
+output vector of first image: [    0.         52766.88424917     0.             0.
+ 14840.28619491 14164.62850135     0.          7011.882333
+     0.         46979.62976127]
+classification of first image: 1 with confidence 52766.88424917019; real label 5
+correctly recognized images after initialization: 10.076666666666668%
+```
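
Neither `./read_display_mnist.py` nor `./train.py` appears in this diff, so the two sketches below only illustrate what those steps involve; the helper names, file names, layer sizes, and activation function in them are assumptions, not the repository's actual code.

Reading the MNIST files into numpy arrays: the IDX format is documented on the MNIST page (a big-endian magic number, one 32-bit length per dimension, then the raw bytes), so a minimal reader might look like this. The gzipped file names are an assumption about what `./download-mnist.sh` fetches.

```
import gzip
import struct

import numpy as np

def read_idx(path):
    """Read one IDX file (images or labels) into a numpy array."""
    with gzip.open(path, 'rb') as f:
        # magic number: two zero bytes, a data-type code, and the number of dimensions
        _zero, dtype_code, ndim = struct.unpack('>HBB', f.read(4))
        assert dtype_code == 0x08          # unsigned byte, the only type MNIST uses
        # one big-endian 32-bit length per dimension, then the raw data
        shape = struct.unpack('>' + 'I' * ndim, f.read(4 * ndim))
        return np.frombuffer(f.read(), dtype=np.uint8).reshape(shape)

train_images = read_idx('train-images-idx3-ubyte.gz')   # shape (60000, 28, 28)
train_labels = read_idx('train-labels-idx1-ubyte.gz')   # shape (60000,)
```

The randomly initialized network: the diff only states two hidden layers with parametrizable sizes, random weights and biases, a 10-element output vector, the largest output taken as the classification, and its value reported as the "confidence". The sketch below assumes 784 flattened pixel inputs, hidden sizes of 16, and ReLU activations (consistent with the exact zeros and large values in the sample output), none of which is confirmed by the diff.

```
import numpy as np

rng = np.random.default_rng()

def init_network(n_input=28 * 28, hidden_sizes=(16, 16), n_output=10):
    """Random weights and biases for the hidden layers and the output layer."""
    sizes = [n_input, *hidden_sizes, n_output]
    weights = [rng.random((n_out, n_in)) for n_in, n_out in zip(sizes, sizes[1:])]
    biases = [rng.random(n_out) for n_out in sizes[1:]]
    return weights, biases

def forward(image, weights, biases):
    """Forward pass: ReLU on the hidden layers, raw values on the output layer."""
    a = image.reshape(-1).astype(float)        # flatten the 28x28 pixels
    for w, b in zip(weights[:-1], biases[:-1]):
        a = np.maximum(0.0, w @ a + b)         # ReLU
    return weights[-1] @ a + biases[-1]        # 10 raw output values, one per digit

weights, biases = init_network()
output = forward(train_images[0], weights, biases)   # train_images from the reader sketch above
print('classification:', output.argmax(), 'with confidence', output.max())
```

Before any training, such a network gets about 10% of the images right, i.e. chance level for ten classes, which matches the `./train.py` output quoted in the diff.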