Setting L2 part way through training

Is there an easy way to set L2 part way through training, rather than only when the model is created? I can see how to vary the learning rate on an existing model, but I don't see a corresponding L2 method. Should I be creating a NeuralNetConfiguration and attaching it to each layer of my existing model?

Nope - that doesn’t work.

It looks like I need to make a custom Schedule.
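
For reference, here's a minimal sketch of what implementing a custom schedule could look like, assuming the org.nd4j.linalg.schedule.ISchedule interface (the mechanism used for learning rate schedules); the class name and the halve-every-10-epochs rule are made up for illustration:

```java
import org.nd4j.linalg.schedule.ISchedule;

// Hypothetical schedule: halves the value every 10 epochs
public class HalvingSchedule implements ISchedule {
    private final double initialValue;

    public HalvingSchedule(double initialValue) {
        this.initialValue = initialValue;
    }

    @Override
    public double valueAt(int iteration, int epoch) {
        // Queried by the network as training progresses;
        // integer division gives step-wise halving
        return initialValue * Math.pow(0.5, epoch / 10);
    }

    @Override
    public ISchedule clone() {
        return new HalvingSchedule(initialValue);
    }
}
```

Note that if the network configuration has to round-trip through JSON, a custom schedule will likely also need Jackson annotations; the built-in schedules in org.nd4j.linalg.schedule (StepSchedule, ExponentialSchedule, etc.) avoid that.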

Is this still the case? And can we change DropOut or WeightDecay programmatically?

@DonaldAlan l1/l2 is generally set like this: GradientCheckTests.java at commit f9aebec79e18671c1c5680b31262467f8e83d30f in the eclipse/deeplearning4j repo on GitHub
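
In other words, it's fixed when the configuration is built. A minimal sketch of the usual pattern (layer sizes, activations, and the 1e-4 value are just illustrative):

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
        .l2(1e-4)   // L2 penalty, set once at configuration time
        .list()
        .layer(new DenseLayer.Builder().nIn(10).nOut(20)
                .activation(Activation.RELU).build())
        .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MSE)
                .nIn(20).nOut(1).activation(Activation.IDENTITY).build())
        .build();

MultiLayerNetwork net = new MultiLayerNetwork(conf);
net.init();
```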

I mean: can we change it DYNAMICALLY while a MultiLayerNetwork is training (not just in the config)? We can change the learning rate that way.
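
For comparison, this is the learning-rate case being referred to; it assumes an already-initialized MultiLayerNetwork called net, and the values are arbitrary:

```java
// Update the learning rate for all layers of a network mid-training
net.setLearningRate(1e-4);

// Or for a single layer, by layer index
net.setLearningRate(0, 1e-4);
```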

@DonaldAlan in that case, not that I can see. Typically only the learning rate is put on a schedule; I'm not sure I've seen an l2 penalty on a schedule.
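
For example, the usual way a learning rate ends up on a schedule is via the updater at configuration time; a rough sketch (the updater choice and numbers are illustrative, and there is no analogous l2 hook here):

```java
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.nd4j.linalg.learning.config.Adam;
import org.nd4j.linalg.schedule.ExponentialSchedule;
import org.nd4j.linalg.schedule.ScheduleType;

// Learning rate starts at 1e-2 and decays by a factor of 0.95 each epoch
NeuralNetConfiguration.Builder builder = new NeuralNetConfiguration.Builder()
        .updater(new Adam(new ExponentialSchedule(ScheduleType.EPOCH, 1e-2, 0.95)));
// ... rest of the configuration as usual
```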