Warning: 1 class was never predicted by the model and was excluded from average precision
Classes excluded from average precision: [1]

I am trying to train a binary classifier. I have 178,200 records, and I checked that the data contains plenty of both 1 and 0 labels.
But I get the warning above.

here is my code:

import org.datavec.api.records.reader.RecordReader;
import org.datavec.api.records.reader.impl.csv.CSVRecordReader;
import org.datavec.api.split.FileSplit;
import org.deeplearning4j.datasets.datavec.RecordReaderDataSetIterator;
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.common.io.ClassPathResource;
import org.nd4j.evaluation.classification.Evaluation;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.dataset.DataSet;
import org.nd4j.linalg.dataset.SplitTestAndTrain;
import org.nd4j.linalg.dataset.api.iterator.DataSetIterator;
import org.nd4j.linalg.learning.config.Nesterovs;
import org.nd4j.linalg.lossfunctions.LossFunctions;

import java.io.File;
import java.io.IOException;

public class Network {

    private static final int CLASSES_COUNT = 2;
    private static final int FEATURES_COUNT = 1000;

    public void train() throws IOException, InterruptedException {
        int seed = 123456;

        int numInputs = 1000;
        int numOutputs = 2;
        int numHiddenNodes = 2 * numInputs + numOutputs;
        double learningRate = 0.005;

        RecordReader recordReader = new CSVRecordReader(0, ',');
        recordReader.initialize(new FileSplit(new ClassPathResource("data.csv").getFile()));

        System.out.println("start to read data");
        DataSetIterator iterator = new RecordReaderDataSetIterator(recordReader, 17820, FEATURES_COUNT, CLASSES_COUNT);
        DataSet allData = iterator.next();

        SplitTestAndTrain testAndTrain = allData.splitTestAndTrain(0.65);
        DataSet trainingData = testAndTrain.getTrain();
        DataSet testData = testAndTrain.getTest();

        System.out.println("split data, start to build configuration");
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .seed(seed)
                .updater(new Nesterovs(learningRate, 0.9))
                .list()
                .layer(0, new DenseLayer.Builder().nIn(numInputs).nOut(numHiddenNodes)
                        .activation(Activation.RELU).build())
                .layer(1, new DenseLayer.Builder().nIn(numHiddenNodes).nOut(numHiddenNodes)
                        .activation(Activation.RELU).build())
                .layer(2, new OutputLayer.Builder(LossFunctions.LossFunction.XENT)
                        .activation(Activation.SIGMOID)
                        .nIn(numHiddenNodes).nOut(numOutputs).build())
                .build();

        MultiLayerNetwork model = new MultiLayerNetwork(conf);
        model.init();

        System.out.println("built configuration, start to fit model");
        model.fit(trainingData.getFeatures(), trainingData.getLabels());

        System.out.println("fit ended, start evaluating");
        INDArray output = model.output(testData.getFeatures());

        Evaluation eval = new Evaluation(CLASSES_COUNT);
        eval.eval(testData.getLabels(), output);
        System.out.println(eval.stats());

        model.save(new File("mynet.zip"));
    }
}


What you are seeing is just the effect of the network not training properly.

You will have to tune its hyperparameters to find a configuration where your network learns. You are trying to do quite a lot of different things in your code, and you are using huge minibatches; that isn't always a good idea.
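To illustrate the minibatch point: with a batch size of 17820 the optimizer gets only a handful of parameter updates per epoch over your 178,200 records, while a smaller batch size gives it many more chances to adjust the weights. Here is a minimal plain-Java sketch (no DL4J dependency; the `minibatches` helper is hypothetical, just for counting how many updates each batch size would yield):

```java
import java.util.ArrayList;
import java.util.List;

public class MinibatchDemo {

    // Partition the sample indices 0..numSamples-1 into consecutive
    // minibatches of at most batchSize elements each.
    static List<int[]> minibatches(int numSamples, int batchSize) {
        List<int[]> batches = new ArrayList<>();
        for (int start = 0; start < numSamples; start += batchSize) {
            int end = Math.min(start + batchSize, numSamples);
            int[] batch = new int[end - start];
            for (int i = start; i < end; i++) {
                batch[i - start] = i;
            }
            batches.add(batch);
        }
        return batches;
    }

    public static void main(String[] args) {
        // One parameter update per minibatch per epoch:
        System.out.println(minibatches(178200, 17820).size()); // 10 updates/epoch
        System.out.println(minibatches(178200, 128).size());   // 1393 updates/epoch
    }
}
```

The same idea applies directly in DL4J: the batch size you pass to `RecordReaderDataSetIterator` controls how many rows go into each `fit` step, so shrinking it from 17820 to something on the order of 32-256 usually gives the network a much better chance to learn.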

Take a look at https://deeplearning4j.konduit.ai/tuning-and-training/troubleshooting-training#troubleshooting-neural-net-training for more on tuning your network.

If you are just starting out with DL4J you might also want to take a look at this: https://www.dubs.tech/guides/quickstart-with-dl4j/

Also the examples have been reworked recently, so you might want to take another look at how things are done there: https://github.com/eclipse/deeplearning4j-examples/tree/master/dl4j-examples#classification