After a stretch of close highlighting and detail work, I needed an overview of the chapters beyond Chapter 4 in “TensorFlow for Deep Learning.” At a glance, Convolutional Neural Networks look the most promising.
Meanwhile, I have to start poring over the statistical examples in Chapter 4 (see below: ‘Impediments’), to make sure I’ve got a clear picture of the dimensions of tensors in NumPy and TensorFlow.
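As a quick sanity check on those dimensions, here is a minimal NumPy sketch of how rank and shape relate (TensorFlow tensors report `ndim` and `shape` with the same convention; the 32×28×28 “batch” shape is just an illustrative example, not one from the book):

```python
import numpy as np

# A scalar, a vector, a matrix, and a rank-3 tensor: the shape tuple
# has one entry per axis, and ndim is the number of axes.
scalar = np.array(5.0)
vector = np.array([1.0, 2.0, 3.0])
matrix = np.zeros((2, 3))
batch = np.zeros((32, 28, 28))  # e.g. 32 samples of a 28x28 grid

for t in (scalar, vector, matrix, batch):
    print(t.ndim, t.shape)
# 0 ()
# 1 (3,)
# 2 (2, 3)
# 3 (32, 28, 28)
```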
Ultimately, I am not sure whether my system is itself a version of a neural network, since it involves a sort of backpropagation and some one-hot juggling, or just a way of rejiggering MIDI data sets. Could be both.
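For concreteness, this is a hypothetical sketch of the kind of one-hot juggling I mean, not my system’s actual code: encoding a short sequence of MIDI note numbers (0–127) into a matrix of one-hot rows and recovering the notes with `argmax`:

```python
import numpy as np

# Hypothetical example: one-hot encode a sequence of MIDI note numbers
# (valid range 0-127) into a (sequence_length, 128) matrix.
notes = [60, 62, 64, 65]  # C4, D4, E4, F4
one_hot = np.eye(128)[notes]  # each row has a single 1 at the note's index

# Recover the original notes by taking the index of the 1 in each row.
recovered = one_hot.argmax(axis=1)
print(one_hot.shape)     # (4, 128)
print(list(recovered))   # [60, 62, 64, 65]
```

The round trip (notes → one-hot → argmax → notes) is the basic move whether the matrix feeds a network or is just a rejiggered data set.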
