# Custom loss function in ComputationGraph, composing mean square and leakyReLU

Hello,
I’d like to build a custom loss function for a ComputationGraph (with a single output), defined simply as the mean square of the leaky ReLU of the difference between output and label — i.e. a loss that penalizes little when the output is below the label but heavily when it is above it. (I have seen other people on the internet interested in such a loss function.)
If I’m not mistaken, the derivative (for a scalar output, which is my case) is simply `2*lReLU(y-y*)*[H(y-y*)+a*(1-H(y-y*))]`, where y is the output, y* the label, H the Heaviside step function, and a ≪ 1 the leakiness of the leaky ReLU. I’ve found some broken links, and I suspect the documentation on this subject is still a work in progress (or I haven’t found the most recent sources). Where should I look in order to implement it? Is there a simple way to do it?
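In case it helps, that gradient formula can be checked numerically. Below is a quick standalone sketch (plain Java, no DL4J; the class and method names are my own) comparing the analytic gradient against a central finite difference:

```java
// Standalone check of the squared-leaky-ReLU loss and the gradient formula above.
public class LeakyReluSquaredLoss {
    static final double A = 0.01; // leakiness a << 1

    // leaky ReLU
    static double lrelu(double d) { return d > 0 ? d : A * d; }

    // loss for a single output: (lReLU(y - y*))^2
    static double loss(double y, double yStar) {
        double f = lrelu(y - yStar);
        return f * f;
    }

    // analytic gradient: 2*lReLU(y-y*) * [H(y-y*) + a*(1 - H(y-y*))]
    static double grad(double y, double yStar) {
        double d = y - yStar;
        double h = d > 0 ? 1.0 : 0.0; // Heaviside
        return 2 * lrelu(d) * (h + A * (1 - h));
    }

    public static void main(String[] args) {
        double eps = 1e-6;
        // one point above the label, one below: the two regimes of the leaky ReLU
        for (double[] p : new double[][] {{2.0, 1.0}, {1.0, 2.0}}) {
            double y = p[0], yStar = p[1];
            double numeric = (loss(y + eps, yStar) - loss(y - eps, yStar)) / (2 * eps);
            System.out.printf("y=%.1f y*=%.1f analytic=%.6f numeric=%.6f%n",
                    y, yStar, grad(y, yStar), numeric);
        }
    }
}
```

For y above the label the gradient is the full 2·(y−y*), while below the label it is scaled by a², matching the asymmetric penalty you describe.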
Thank you very much in advance

@EquanimeAugello you can do custom losses but with the flexibility you want I strongly suggest looking at samediff. Depending on your response I can show you how to get started with either.
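For reference, a loss of this shape is short to express in SameDiff. The following is only a sketch of the loss part, assuming a recent ND4J/SameDiff version; the placeholder names and shapes are hypothetical, and the network itself is omitted:

```java
import org.nd4j.autodiff.samediff.SDVariable;
import org.nd4j.autodiff.samediff.SameDiff;
import org.nd4j.linalg.api.buffer.DataType;

public class LeakyReluSquaredLossGraph {
    public static SameDiff build() {
        SameDiff sd = SameDiff.create();
        // placeholders for the network output y and the label y* (names/shapes hypothetical)
        SDVariable output = sd.placeHolder("output", DataType.FLOAT, -1, 1);
        SDVariable label  = sd.placeHolder("label",  DataType.FLOAT, -1, 1);

        double a = 0.01;                              // leakiness a << 1
        SDVariable diff = output.sub(label);          // y - y*
        SDVariable lr   = sd.nn().leakyRelu(diff, a); // leaky ReLU of the difference
        lr.mul(lr).mean("loss");                      // mean square -> scalar loss

        sd.setLossVariables("loss"); // SameDiff derives the gradient automatically
        return sd;
    }
}
```

The point of going through SameDiff is that you only declare the forward loss; the `2*lReLU(y-y*)*[H(y-y*)+a*(1-H(y-y*))]` backward pass is obtained by automatic differentiation rather than implemented by hand.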

Thank you very much,
If it can be of any use, here is the configuration I have written so far:

```java
ComputationGraphConfiguration conf = customResBlocks(
        new NeuralNetConfiguration.Builder()
            .weightInit(WeightInit.XAVIER)
            .updater(new Sgd(0.01))
            .graphBuilder()
            .addInputs("A_0_3"),
        boardSize, nBlocks) // number of residual blocks
    // addLayer call reconstructed: the fragment ".nIn(4).nOut(1).build(), \"ultimo\")"
    // implies an output layer attached to the vertex "ultimo"
    .addLayer("output", new OutputLayer.Builder()
        .lossFunction(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD) // here I need a custom one
        .nIn(4).nOut(1).build(), "ultimo")
    .setOutputs("output")
    .build();
```
```java
public GraphBuilder customResBlocks(GraphBuilder previousArchitecture, int boardSize, int nBlocks) {
    // the input vertex/layer must be named "A_0_3" for this to work
    GraphBuilder resBlocks = previousArchitecture;
    for (int i = 1; i <= nBlocks; i++) {
        resBlocks = resBlocks
            // addLayer calls and layer names reconstructed: the builder openings
            // were missing from the original snippet
            .addLayer(i + "_1", new ConvolutionLayer.Builder()
                .kernelSize(3)
                .nIn(1) // note that nIn need not be specified in later layers
                .stride(1, 1)
                .nOut(boardSize)
                //.activation(Activation.LEAKYRELU)
                .build(), (i - 1) + "_3")
            .addLayer(i + "_2", new ConvolutionLayer.Builder()
                .kernelSize(3)
                .nIn(boardSize)
                .stride(1, 1)
                .nOut(boardSize)
                //.activation(Activation.LEAKYRELU)
                .build(), i + "_1")