Custom loss function in ComputationGraph, composing mean square and leakyReLU

Hello,
I’d like to build a custom loss function for a ComputationGraph (with a single output), defined simply as the mean square of the leaky ReLU of the difference between output and label, i.e. a loss function that penalizes little when the output is less than the label but a lot when it is greater than it (I have seen other people online interested in such a loss function).
If I’m not mistaken, the derivative (for a scalar output, which is my case) is simply 2*lReLU(y-y*)*[H(y-y*) + a*(1-H(y-y*))], where y is the output, y* the label, H the Heaviside step function, and a << 1 the leakiness of the lReLU. I’ve found some broken links, and I suspect the documentation on the subject is still a work in progress (or I haven’t found the most recent sources)… What should I refer to in order to implement this? Is there a simple way to do it?
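To make sure I have the math right, here is what I mean as plain scalar Java (just a sketch; a is the leakiness):

    // loss(y, y*) = lReLU(y - y*)^2, with leakiness a << 1
    static double lRelu(double d, double a) { return d > 0 ? d : a * d; }

    static double loss(double y, double yStar, double a) {
        double l = lRelu(y - yStar, a);
        return l * l;
    }

    // derivative w.r.t. the output y:
    // 2 * lReLU(y - y*) * [H(y - y*) + a * (1 - H(y - y*))]
    static double dLossDy(double y, double yStar, double a) {
        double d = y - yStar;
        double slope = d > 0 ? 1.0 : a;   // H(d) + a * (1 - H(d))
        return 2.0 * lRelu(d, a) * slope;
    }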
Thank you very much in advance

@EquanimeAugello you can do custom losses, but with the flexibility you want I strongly suggest looking at SameDiff. Depending on your response I can show you how to get started with either.

Thank you very much,
if you suggest proceeding with samediff I’ll be glad to learn more about it.
If it can be of any use, up to now I have written this configuration:

ComputationGraphConfiguration conf = customResBlocks(
                new NeuralNetConfiguration.Builder()
                .weightInit(WeightInit.XAVIER)
                .updater(new Sgd(0.01))
                .graphBuilder()
                .addInputs("A_0_3"), boardSize, nBlocks //number of residual blocks
            )
            .addLayer("penultimo", new DenseLayer.Builder().nIn(arrayArea).nOut(arrayArea).build(), "nBlocks"+"_3")
            .addLayer("ultimo", new DenseLayer.Builder().nIn(arrayArea).nOut(1).build(),"penultimo")
            .addLayer("output", new OutputLayer.Builder()
                    .lossFunction(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD) // Here I need a custom one
                    .nIn(1).nOut(1).build(), "ultimo") // nIn must match the nOut(1) of "ultimo"
            .setOutputs("output")
            .build();
public GraphBuilder customResBlocks(GraphBuilder previousArchitecture, int boardSize, int nBlocks){ // the input vertex/layer must be named "A_0_3" for this to work
        GraphBuilder resBlocks=previousArchitecture;
        for(int i=1;i<=nBlocks; i++){
            resBlocks=resBlocks
            .addLayer(i+"_1", new ConvolutionLayer.Builder()
                        .kernelSize(3)
                        .padding(1)
                        .nIn(1)
                        //Note that nIn need not be specified in later layers
                        .stride(1,1)
                        .nOut(boardSize)
                        //.activation(Activation.LEAKYRELU)
                        .build(),(i-1)+"_3" )
            .addLayer("BN_"+i+"_1", new BatchNormalization.Builder().build(), i+"_1")
            .addLayer("A_"+i+"_1", new ActivationLayer(Activation.LEAKYRELU), "BN_"+i+"_1" )
            .addLayer(i+"_2", new ConvolutionLayer.Builder()
                        .kernelSize(3)
                        .padding(1)
                        .nIn(boardSize)
                        //Note that nIn need not be specified in later layers
                        .stride(1,1)
                        .nOut(boardSize)
                        //.activation(Activation.LEAKYRELU)
                        .build(),i+"_1" )
            .addLayer("BN_"+i+"_2", new BatchNormalization.Builder().build(), i+"_2")
            .addVertex(i+"_3", new ElementWiseVertex(ElementWiseVertex.Op.Add), "BN_"+i+"_2", "A_"+(i-1)+"_3")
            .addLayer("A_"+i+"_3", new ActivationLayer(Activation.LEAKYRELU), i+"_3" );
        }
        return resBlocks;
    }

@EquanimeAugello take a look at this then and see how far you can get: deeplearning4j-examples/dl4j-examples/src/main/java/org/deeplearning4j/examples/advanced/features/customizingdl4j/lossfunctions at 051c59bd06b38ed39ca92f5940a6ca43b0f34c0f · deeplearning4j/deeplearning4j-examples · GitHub
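Roughly, following the pattern in those examples, your squared-leaky-ReLU loss as an ILossFunction would look something like the sketch below. It's untested, the class and field names are mine, and note the Pair import lives in org.nd4j.common.primitives on recent versions (org.nd4j.linalg.primitives on older ones):

import org.nd4j.common.primitives.Pair; // org.nd4j.linalg.primitives.Pair on older versions
import org.nd4j.linalg.activations.IActivation;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.lossfunctions.ILossFunction;

public class LossSquaredLeakyRelu implements ILossFunction {

    private final double a; // leakiness, e.g. 0.01

    public LossSquaredLeakyRelu(double a) { this.a = a; }

    // per-element score: lReLU(y - y*)^2
    private INDArray scoreArray(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask) {
        INDArray output = activationFn.getActivation(preOutput.dup(), true); // y
        INDArray diff = output.sub(labels);                                  // y - y*
        INDArray pos = diff.gt(0).castTo(diff.dataType());                   // H(y - y*); castTo because gt() returns booleans on recent versions
        INDArray lrelu = diff.mul(pos.add(pos.rsub(1.0).mul(a)));            // leaky ReLU of the difference
        INDArray scoreArr = lrelu.mul(lrelu);
        if (mask != null) scoreArr.muliColumnVector(mask);
        return scoreArr;
    }

    @Override
    public double computeScore(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask, boolean average) {
        double score = scoreArray(labels, preOutput, activationFn, mask).sumNumber().doubleValue();
        return average ? score / labels.size(0) : score;
    }

    @Override
    public INDArray computeScoreArray(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask) {
        return scoreArray(labels, preOutput, activationFn, mask).sum(1); // one score per example
    }

    @Override
    public INDArray computeGradient(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask) {
        INDArray output = activationFn.getActivation(preOutput.dup(), true);
        INDArray diff = output.sub(labels);
        INDArray pos = diff.gt(0).castTo(diff.dataType());
        INDArray slope = pos.add(pos.rsub(1.0).mul(a));          // H + a * (1 - H), i.e. the lReLU derivative
        INDArray dLda = diff.mul(slope).mul(slope).mul(2.0);     // 2 * lReLU(y - y*) * slope
        INDArray grad = activationFn.backprop(preOutput.dup(), dLda).getFirst(); // back through the output activation
        if (mask != null) grad.muliColumnVector(mask);
        return grad;
    }

    @Override
    public Pair<Double, INDArray> computeGradientAndScore(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask, boolean average) {
        return new Pair<>(computeScore(labels, preOutput, activationFn, mask, average),
                computeGradient(labels, preOutput, activationFn, mask));
    }

    @Override
    public String name() {
        return "LossSquaredLeakyRelu(" + a + ")";
    }
}

Once you have something like that, you'd pass it to your output layer with .lossFunction(new LossSquaredLeakyRelu(0.01)) instead of the built-in enum; the linked examples should also show any serialization details you need if you plan to save the configuration.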

I mentioned potentially doing samediff because it’s a bit easier to work with when you want something super specific.
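For a rough idea of how it reads there, the same loss written against the SameDiff API would be something like this (a sketch only; the placeholder names and shapes are mine):

import org.nd4j.autodiff.samediff.SDVariable;
import org.nd4j.autodiff.samediff.SameDiff;
import org.nd4j.linalg.api.buffer.DataType;

SameDiff sd = SameDiff.create();
SDVariable out   = sd.placeHolder("out", DataType.FLOAT, -1, 1);   // network output y
SDVariable label = sd.placeHolder("label", DataType.FLOAT, -1, 1); // target y*
SDVariable diff  = out.sub(label);
SDVariable lrelu = sd.nn().leakyRelu(diff, 0.01);                  // leakiness a = 0.01
SDVariable loss  = sd.math().square(lrelu).mean();                 // mean square of lReLU(y - y*)

SameDiff differentiates that for you, so you never have to hand-code the gradient.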

Thank you very much.