One additional note: I read a couple of papers (sorry, I don't have the links handy) about compressing NNs by removing filters with low entropy. I ran some quick tests, zeroing out selected filters and checking the outputs. I tried different metrics for ranking the filters, and eventually concluded that the accuracy drop wasn't worth the compression. I also noticed that entropy didn't give useful rankings: whether I zeroed the low-entropy filters first or the high-entropy ones, the accuracy loss was similar to removing random filters.
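For anyone curious, here is roughly the kind of experiment I mean, as a minimal numpy sketch. The entropy metric (histogram of each filter's weights) is just one possible choice, and the function names and the 25% pruning fraction are my own assumptions, not from the papers:

```python
import numpy as np

def filter_entropy(filt, bins=16):
    # Shannon entropy of the filter's weight distribution,
    # estimated from a histogram (one of several possible metrics)
    hist, _ = np.histogram(filt.ravel(), bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def zero_low_entropy_filters(weights, fraction=0.25):
    # weights: conv weight tensor, shape (out_channels, in_channels, kh, kw)
    # zeroes out the lowest-entropy filters; to test the opposite ranking,
    # take the end of the argsort instead of the start
    entropies = np.array([filter_entropy(f) for f in weights])
    k = int(len(weights) * fraction)
    pruned = weights.copy()
    if k > 0:
        idx = np.argsort(entropies)[:k]  # indices of lowest-entropy filters
        pruned[idx] = 0.0
    return pruned, entropies

# quick demo on random weights
rng = np.random.default_rng(0)
w = rng.normal(size=(8, 3, 3, 3)).astype(np.float32)
pruned, ent = zero_low_entropy_filters(w, fraction=0.25)
```

You would then run the pruned network on a validation set and compare accuracy against pruning high-entropy or random filters; in my tests all three came out about the same.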
Did anybody else try to do something similar? (is it worth a new thread?)