Clipping weights (via Constraints?)

I’m trying to implement a Wasserstein GAN in dl4j (background: Read-through: Wasserstein GAN). WGAN requires weight clipping; the reference Python implementation does `weights = [np.clip(w, -self.clip_value, self.clip_value) for w in weights]`, which clips each individual value, not the norm of a vector. I suspected dl4j’s `MaxNormConstraint` might do a similar job, but it seems to operate on a whole weight vector, clipping its norm. How can I efficiently and elegantly constrain the weights themselves (not the gradients or activations) in dl4j? Something like the sketch below is what I have in mind.
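To make the question concrete, here is roughly what I imagine such a constraint might look like. This is only a sketch based on skimming how `MaxNormConstraint` is put together: `ClipValueConstraint` is a name I made up, and I’m assuming `BaseConstraint` exposes a per-parameter `apply(INDArray)` hook, a `Set<String>` constructor, and a protected `params` field. The element-wise clip itself uses ND4J’s `BooleanIndexing.replaceWhere`, which modifies the array in place.

```java
import java.util.Set;

import org.deeplearning4j.nn.conf.constraint.BaseConstraint;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.indexing.BooleanIndexing;
import org.nd4j.linalg.indexing.conditions.Conditions;

/**
 * Hypothetical element-wise weight clipping constraint for WGAN -- NOT an
 * existing dl4j class. Clips every entry of the parameter array to
 * [-clipValue, clipValue], mirroring np.clip(w, -clip_value, clip_value),
 * rather than constraining the norm as MaxNormConstraint does.
 */
public class ClipValueConstraint extends BaseConstraint {

    private final double clipValue;

    public ClipValueConstraint(double clipValue, Set<String> paramNames) {
        super(paramNames); // assumption: BaseConstraint(Set<String>) exists
        this.clipValue = clipValue;
    }

    @Override
    public void apply(INDArray param) {
        // In-place element-wise clip: entries above clipValue are set to
        // clipValue, entries below -clipValue are set to -clipValue.
        BooleanIndexing.replaceWhere(param, clipValue, Conditions.greaterThan(clipValue));
        BooleanIndexing.replaceWhere(param, -clipValue, Conditions.lessThan(-clipValue));
    }

    @Override
    public ClipValueConstraint clone() {
        // assumption: the inherited params field holds the parameter names
        return new ClipValueConstraint(clipValue, params);
    }
}
```

If that works, I’d presumably attach it via something like `.constrainWeights(new ClipValueConstraint(0.01, paramNames))` on the network builder, using the 0.01 clip value from the WGAN paper. Is that the intended extension point, or is there a better way?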

I do see, though, that gradient penalties reportedly outperform weight clipping: https://arxiv.org/pdf/1704.00028.pdf