Martin Pitt [Sat, 29 Aug 2020 19:59:56 +0000 (21:59 +0200)]
Tweak network structure
Increase the size of the first hidden layer; otherwise the network
does not "see" enough detail in the input. Drop the second hidden
layer, as it actually makes the recognition rate much worse.
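A minimal sketch of what the tweak amounts to; the concrete layer
sizes below are only illustrative, not the ones used in the
repository:

    import numpy as np

    # Hypothetical sizes: one wider first hidden layer instead of two
    # narrow ones; 784 inputs (28x28 pixels) and 10 output classes.
    layer_sizes = [784, 80, 10]            # was e.g. [784, 20, 16, 10]
    weights = [np.random.randn(n_out, n_in)
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:])]
    print([W.shape for W in weights])      # [(80, 784), (10, 80)]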
Martin Pitt [Sat, 29 Aug 2020 19:29:39 +0000 (21:29 +0200)]
Process many images in parallel
Provide one object per NN layer and implement each layer's
functionality separately, as in
https://www.kdnuggets.com/2019/08/numpy-neural-networks-computational-graphs.html
Each layer now takes not just a single image vector, but a whole batch
of 10,000 of them, which massively speeds up the computation -- much
less time is spent in Python iterations.
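A minimal sketch of per-layer objects working on a whole batch at
once; class names and layer sizes are illustrative placeholders, not
the actual API in this repository:

    import numpy as np

    class DenseLayer:
        """Fully connected layer whose forward() takes a whole batch."""
        def __init__(self, n_in, n_out):
            self.W = np.random.randn(n_out, n_in) * 0.01
            self.b = np.zeros((n_out, 1))

        def forward(self, X):
            # X has shape (n_in, n_samples), e.g. (784, 10000) for the
            # whole test set: one matrix product instead of a Python
            # loop over 10,000 separate image vectors
            return self.W @ X + self.b

    class ReLULayer:
        def forward(self, X):
            return np.maximum(0.0, X)

    # forward-feed a random batch through a stack of layer objects
    X = np.random.rand(784, 10000)
    layers = [DenseLayer(784, 80), ReLULayer(), DenseLayer(80, 10)]
    for layer in layers:
        X = layer.forward(X)
    print(X.shape)    # (10, 10000): one output column per image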
Martin Pitt [Sat, 29 Aug 2020 10:48:59 +0000 (12:48 +0200)]
Initial Neural network with forward feeding
Two hidden layers with parametrizable sizes. Two possible transfer
functions, defaulting to ReLU for now.
Initialize weights and biases randomly. This of course gives totally
random classifications, but at least makes sure that the data
structures and computations work.
Also add a function to recognize the test images and count the correct
ones. Without training, 10% of the samples are expected to be right by
pure chance.
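A rough sketch of the forward feeding and the test-set counting
described above, assuming MNIST-sized inputs; all names, layer sizes
and the random stand-in data are placeholders, not the actual code:

    import numpy as np

    def relu(x):
        return np.maximum(0.0, x)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class Network:
        """Forward-feeding net, two hidden layers of parametrizable size."""
        def __init__(self, n_in=784, n_hid1=20, n_hid2=16, n_out=10,
                     transfer=relu):
            sizes = [n_in, n_hid1, n_hid2, n_out]
            # random weights/biases: classifications are pure chance,
            # but the shapes and the forward computation can be verified
            self.weights = [np.random.randn(o, i)
                            for i, o in zip(sizes, sizes[1:])]
            self.biases = [np.random.randn(o, 1) for o in sizes[1:]]
            self.transfer = transfer

        def forward(self, x):
            for W, b in zip(self.weights, self.biases):
                x = self.transfer(W @ x + b)
            return x

    def count_correct(net, images, labels):
        """Count test images whose strongest output matches the label."""
        return sum(int(np.argmax(net.forward(x)) == y)
                   for x, y in zip(images, labels))

    # with random weights, roughly 10% of random test data is "correct"
    images = [np.random.rand(784, 1) for _ in range(1000)]
    labels = np.random.randint(0, 10, size=1000)
    net = Network()
    print(count_correct(net, images, labels) / 1000)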