OK, so the problem here turned out to be these reused fields:
conv_activation_layer = LeakyReLU(0.2)
deconv_activation_layer = ReLU()
Keras treats these as shared layers rather than, as you might expect, creating a separate copy for each use. Technically the resulting model is still valid, but as an architecture, reusing one activation layer instance at multiple points doesn't make much sense in practice.
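For illustration, here is a minimal sketch of what the sharing looks like (the layer sizes and names here are made up, not the actual model; imports assume tensorflow.keras):

```python
from tensorflow.keras.layers import Input, Dense, LeakyReLU
from tensorflow.keras.models import Model

# The same LeakyReLU instance is applied at two points in the graph,
# so Keras records it as a single shared layer rather than two copies.
shared_act = LeakyReLU(0.2)

inputs = Input(shape=(16,))
x = shared_act(Dense(32)(inputs))
x = shared_act(Dense(32)(x))  # same layer object called a second time
model = Model(inputs, x)

# The shared instance shows up only once in model.layers.
print(sum(layer is shared_act for layer in model.layers))  # -> 1
```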
So there are two simple workarounds here:
(a) replace these definitions with a lambda, so that a new activation layer instance is created on each call:
conv_activation_layer = lambda x: LeakyReLU(0.2)(x)
deconv_activation_layer = lambda x: ReLU()(x)
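A quick sketch of how the lambda version behaves when building the graph (again, the surrounding layers are illustrative placeholders, not the real architecture):

```python
from tensorflow.keras.layers import Input, Dense, LeakyReLU
from tensorflow.keras.models import Model

# Every call to the lambda now constructs a fresh LeakyReLU instance.
conv_activation_layer = lambda x: LeakyReLU(0.2)(x)

inputs = Input(shape=(16,))
x = conv_activation_layer(Dense(32)(inputs))
x = conv_activation_layer(Dense(32)(x))
model = Model(inputs, x)

# Two distinct LeakyReLU layers now appear in the model.
print(sum(isinstance(layer, LeakyReLU) for layer in model.layers))  # -> 2
```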
(b) or replace calls of this sort: rel5 = conv_activation_layer(batch5)
with something like this: rel5 = LeakyReLU(0.2)(batch5)
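For completeness, a minimal sketch of option (b) in context; the layers feeding into batch5 below are stand-ins I invented for the example, only the last line reflects the actual change:

```python
from tensorflow.keras.layers import Input, Conv2D, BatchNormalization, LeakyReLU

# Illustrative stand-ins for whatever produces batch5 in the real model.
inputs = Input(shape=(64, 64, 1))
conv5 = Conv2D(32, (5, 5), strides=(2, 2), padding="same")(inputs)
batch5 = BatchNormalization()(conv5)

# Option (b): build a fresh activation layer right at the call site.
rel5 = LeakyReLU(0.2)(batch5)
```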
At some point we’ll try to handle shared activation layers properly, but for now we don’t support shared layers with parameters (we can, however, throw a clearer exception for this case).