- [Learn numpy](https://numpy.org/learn/)
- [MNIST database of handwritten digits](http://yann.lecun.com/exdb/mnist/)
- [Neuron](https://en.wikipedia.org/wiki/Artificial_neuron)
- [Perceptron](https://en.wikipedia.org/wiki/Perceptron)
- [Backpropagation](https://en.wikipedia.org/wiki/Backpropagation)
- [Understanding & Creating Neural Networks with Computational Graphs from Scratch](https://www.kdnuggets.com/2019/08/numpy-neural-networks-computational-graphs.html)
- [3Blue1Brown video series](https://www.youtube.com/playlist?list=PLZHQObOWTQDNU6R1_67000Dx_ZCJB-3pi)
Too high-level for first-time learning, but apparently very abstract and powerful for real-life use:
- [keras](https://keras.io/)
- [tutorial on how to recognize handwriting with keras/tensorflow](https://data-flair.training/blogs/python-deep-learning-project-handwritten-digit-recognition/)
```sh
sudo dnf install -y python3-numpy python3-matplotlib
```
- Do the [NumPy quickstart tutorial](https://numpy.org/devdocs/user/quickstart.html); example:
```python
import numpy as np
import matplotlib.pyplot as plt

grad = np.linspace(0, 1, 10000).reshape(100, 100)  # 100x100 grayscale gradient
plt.imshow(grad, cmap='gray')
plt.show()

plt.imshow(np.sin(np.linspace(0, 10000, 10000)).reshape(100, 100) ** 2, cmap='gray')
plt.show()  # non-blocking mode does not work with QT_QPA_PLATFORM=wayland
```
- Get the handwritten digits training data with `./download-mnist.sh`
- Read the MNIST database into numpy arrays with `./read_display_mnist.py`. Plot the first ten images and show their labels, to make sure the data makes sense (a sketch of such reading code follows the image below):
![visualize training data](screenshots/mnist-visualize-training-data.png)
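
For reference, a minimal sketch of how the raw IDX files could be parsed into numpy arrays; the file names and the `read_idx` helper are illustrative assumptions, not necessarily what `read_display_mnist.py` actually does:

```python
import struct
import numpy as np
import matplotlib.pyplot as plt

def read_idx(path):
    """Parse an IDX file (the MNIST on-disk format) into a numpy array."""
    with open(path, 'rb') as f:
        # header is big-endian: two zero bytes, a dtype code, the number of dimensions
        _zeros, _dtype, ndim = struct.unpack('>HBB', f.read(4))
        shape = struct.unpack('>' + 'I' * ndim, f.read(4 * ndim))
        return np.frombuffer(f.read(), dtype=np.uint8).reshape(shape)

images = read_idx('train-images-idx3-ubyte')   # shape (60000, 28, 28)
labels = read_idx('train-labels-idx1-ubyte')   # shape (60000,)

# plot the first ten digits with their labels as titles
fig, axes = plt.subplots(1, 10, figsize=(10, 2))
for ax, image, label in zip(axes, images, labels):
    ax.imshow(image, cmap='gray')
    ax.set_title(label)
    ax.axis('off')
plt.show()
```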
- Define the structure of the neural network: two hidden layers with parametrizable sizes. Initialize weights and biases randomly. This gives totally random classifications of course, but at least makes sure that the data structures and computations work (a sketch of such a forward pass follows the log below):
```
output vector of first image: [ 0. 52766.88424917 0. 0.
14840.28619491 14164.62850135 0. 7011.882333
classification of first image: 1 with confidence 52766.88424917019; real label 5
correctly recognized images after initialization: 10.076666666666668%
```
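
To make the step concrete, here is a rough sketch of a randomly initialized two-hidden-layer forward pass. The layer sizes and all names are assumptions, and ReLU is merely one activation consistent with the zeros and large values in the log above; the repo's actual code may differ:

```python
import numpy as np

rng = np.random.default_rng()

def init_network(n_in=784, n_h1=16, n_h2=16, n_out=10):
    """Random weights and biases for two parametrizable hidden layers."""
    return [(rng.standard_normal((n, m)), rng.standard_normal(n))
            for m, n in zip([n_in, n_h1, n_h2], [n_h1, n_h2, n_out])]

def forward(network, x):
    """Propagate one flattened 28x28 image through all layers."""
    for weights, biases in network:
        x = np.maximum(0, weights @ x + biases)  # ReLU: an assumption
    return x

network = init_network()
x = images[0].reshape(784).astype(float)  # image from the reading sketch above
output = forward(network, x)
print("classification of first image:", output.argmax(),
      "with confidence", output.max(), "; real label", labels[0])
```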
- Add the backpropagation algorithm and run a first training round. This is slow, as expected; note the `exp` overflow warning in the log (a workaround sketch follows):
```
output vector of first image: [ 0. 52766.88424917 0. 0.
14840.28619491 14164.62850135 0. 7011.882333
classification of first image: 1 with confidence 52766.88424917019; real label 5
correctly recognized images after initialization: 10.076666666666668%
round #0 of learning...
./train.py:18: RuntimeWarning: overflow encountered in exp
  return 1 / (1 + np.exp(-x))
correctly recognized images: 14.211666666666666%
```
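
The `RuntimeWarning` happens because `np.exp(-x)` overflows for large negative `x`; numpy saturates the result to `inf`, so the sigmoid still returns 0 and training proceeds, but the warning can be silenced with a numerically stable formulation. A sketch, not the repo's actual `train.py`:

```python
import numpy as np

def sigmoid(x):
    """Numerically stable logistic function for numpy arrays: exp is only
    ever called on non-positive arguments, so it cannot overflow."""
    out = np.empty_like(x, dtype=float)
    pos = x >= 0
    out[pos] = 1 / (1 + np.exp(-x[pos]))
    e = np.exp(x[~pos])      # x < 0 here, so exp(x) < 1
    out[~pos] = e / (1 + e)  # identical to 1/(1+exp(-x)) for negative x
    return out
```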
- This is way too slow. I found an [interesting approach](https://www.kdnuggets.com/2019/08/numpy-neural-networks-computational-graphs.html) that harnesses the power of numpy by doing the computations for lots of images in parallel, instead of spending a lot of time in Python iterating over tens of thousands of examples. Now the accuracy computation takes negligible time instead of 6 seconds, and each round of training takes less than a second (a sketch of the idea follows the log):
```
output vector of first image: [0.50863223 0.50183558 0.50357349 0.50056673 0.50285531 0.5043152
 0.51588292 0.49403    0.5030618  0.51006963]
classification of first image: 6 with confidence 0.5158829224337754; real label 7
correctly recognized images after initialization: 9.58%
cost after training round 0: 1.0462266880961681
...
cost after training round 99: 0.4499245817840479
correctly recognized images after training: 11.35%
```
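
A sketch of the vectorization trick, assuming sigmoid activations and a quadratic cost as the ~0.5 outputs and the cost values in the log suggest (layer sizes, learning rate, and all names are assumptions): each weight matrix multiplies a whole batch of images at once, so the Python-level loops run over layers and rounds, never over individual examples.

```python
import numpy as np

rng = np.random.default_rng()

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def train_round(weights, biases, X, Y, lr=1.0):
    """One round of batched gradient descent.

    X: (784, n) batch of flattened images; Y: (10, n) one-hot labels.
    The constant factor of the quadratic-cost gradient is absorbed
    into the learning rate.
    """
    # forward pass for the whole batch at once
    activations = [X]
    for W, b in zip(weights, biases):
        activations.append(sigmoid(W @ activations[-1] + b))
    n = X.shape[1]
    cost = np.mean((activations[-1] - Y) ** 2)
    # backward pass; delta has shape (layer_size, n)
    delta = (activations[-1] - Y) * activations[-1] * (1 - activations[-1])
    for i in reversed(range(len(weights))):
        grad_W = delta @ activations[i].T / n
        grad_b = delta.mean(axis=1, keepdims=True)
        if i > 0:
            # propagate the error with the not-yet-updated weights
            delta = (weights[i].T @ delta) * activations[i] * (1 - activations[i])
        weights[i] -= lr * grad_W
        biases[i] -= lr * grad_b
    return cost

# hypothetical layer sizes; images/labels come from the reading sketch above
sizes = [784, 16, 16, 10]
weights = [rng.standard_normal((n, m)) for m, n in zip(sizes, sizes[1:])]
biases = [rng.standard_normal((n, 1)) for n in sizes[1:]]
X = images.reshape(-1, 784).T / 255.0   # (784, 60000)
Y = np.eye(10)[labels].T                # one-hot labels, (10, 60000)
for r in range(100):
    print(f"cost after training round {r}: {train_round(weights, biases, X, Y)}")
```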