After appropriately patching Keras, I proceeded with PReLUs. Unfortunately, I have not matched plain ReLUs on this task, although I came close. At epoch 61, I hit a low of 1.72% training and 2.84% validation error, passably good, but not as good as the 2.48% validation error my plain ReLU network achieved. The training curves can …
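For reference, here is a minimal sketch of what swapping plain ReLUs for PReLU layers looks like in a current Keras, which ships `PReLU` out of the box. The architecture, input shape, and hyperparameters below are placeholders for illustration, not the network I trained:

```python
import tensorflow as tf
from tensorflow.keras import layers

# Hypothetical small convnet: layer sizes and input shape are
# placeholders, not the network from this experiment.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, (3, 3)),
    layers.PReLU(),          # learnable negative slope instead of a fixed ReLU
    layers.Conv2D(64, (3, 3)),
    layers.PReLU(),
    layers.Flatten(),
    layers.Dense(128),
    layers.PReLU(),
    layers.Dense(10, activation='softmax'),
])

model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
model.summary()
```

The only structural change from a plain-ReLU network is that each activation becomes its own `PReLU` layer, since its negative-side slopes are trainable parameters rather than a fixed nonlinearity.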