Question on calculating gradient by external error

Hi, I have a question about using an external error to calculate the gradient, as in this example: MultiLayerNetworkExternalErrors.java

If my network is like this:

MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
        .weightInit(WeightInit.XAVIER)
        .activation(Activation.RELU)
        .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
        .updater(new Nesterovs(0.001, 0.9))
        .list()
        .layer(0, new DenseLayer.Builder().nOut(20).build())
        .layer(1, new DenseLayer.Builder().activation(Activation.SOFTMAX).nOut(4).build())
        .setInputType(InputType.feedForward(1))
        .build();

To calculate the gradient, should I use the plain (pre-softmax) error or the error computed after the softmax activation when I call net.backpropGradient(error, null)?

The error you pass in is the gradient of your loss function with respect to the network's output. Your (external) loss function is computed on the network's output after the final activation (the softmax here), so you should calculate the error after the activation as well.
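
For example, against your config it would look roughly like this. This is only a minimal sketch: the minibatch size, the target array, and the MSE-style loss are placeholders for whatever your external loss actually is, and the updater/parameter-update step is the same as in the linked example.

import java.util.List;

import org.deeplearning4j.nn.gradient.Gradient;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.primitives.Pair;   // package may differ slightly between DL4J versions

MultiLayerNetwork net = new MultiLayerNetwork(conf);   // conf from above
net.init();

int minibatch = 32;                                 // placeholder minibatch size
INDArray input = Nd4j.rand(minibatch, 1);           // nIn = 1, matching InputType.feedForward(1)
INDArray target = Nd4j.rand(minibatch, 4);          // hypothetical targets, nOut = 4

// Forward pass, keeping the layer input activations so backprop can be run afterwards
net.setInput(input);
List<INDArray> activations = net.feedForward(true, false);
INDArray softmaxOut = activations.get(activations.size() - 1);   // output AFTER the softmax

// The external loss is computed on the post-softmax output.
// For an MSE-style loss, dL/dOutput = 2 * (output - target) / minibatch: this is the "error".
INDArray error = softmaxOut.sub(target).muli(2.0 / minibatch);

// Backprop the external error through the network
Pair<Gradient, INDArray> p = net.backpropGradient(error, null);
Gradient gradient = p.getFirst();

// From here, apply the updater and the parameter update exactly as in
// MultiLayerNetworkExternalErrors.java (net.getUpdater().update(...), net.params().subi(...), etc.)

The key point is that error is dL/dOutput calculated on softmaxOut (the post-activation output), not on the pre-softmax values; backpropGradient then backpropagates through the softmax and the rest of the layers itself.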