Spark evaluateRegression causes NullPointerException

Hello, here’s my code.

Training part:

val lstmLayerSize = Seq(50, 100) //Number of units in each LSTM layer

//Set up network configuration:
val conf = new NeuralNetConfiguration.Builder()
        .seed(DateTime.now().getMillis)
        .l2(0.001)
        .weightInit(WeightInit.XAVIER)
        .updater(new RmsProp(0.01))
        .list
        .layer(new LSTM.Builder()
            .nIn(fNames.size+13)   
            .nOut(lstmLayerSize(0))
            .activation(Activation.TANH)
            .dropOut(0.2)
            .gradientNormalization(GradientNormalization.ClipElementWiseAbsoluteValue)
            .gradientNormalizationThreshold(10)
            .build)
        .layer(new LSTM.Builder()
            .nIn(lstmLayerSize(0))
            .nOut(lstmLayerSize(1))
            .activation(Activation.TANH).dropOut(0.2)
            .gradientNormalization(GradientNormalization.ClipElementWiseAbsoluteValue)
            .gradientNormalizationThreshold(10)
            .build)
        .layer(new RnnOutputLayer.Builder(LossFunctions.LossFunction.MEAN_ABSOLUTE_ERROR)
            .activation(Activation.IDENTITY)
            .gradientNormalization(GradientNormalization.ClipElementWiseAbsoluteValue)
            .gradientNormalizationThreshold(10)
            .nIn(lstmLayerSize(1)).nOut(1)
            .build)
        .build

val examplesPerDataSetObject = 1

val averagingFrequency = 16

val batchSizePerWorker = 128

val numEpochs = 50  
val tm = new ParameterAveragingTrainingMaster.Builder(examplesPerDataSetObject)
            .rddTrainingApproach(RDDTrainingApproach.Direct)
            .workerPrefetchNumBatches(2)    //Asynchronously prefetch up to 2 batches
            .averagingFrequency(averagingFrequency)
            .batchSizePerWorker(batchSizePerWorker)
            .build()

val sparkNetwork = new SparkDl4jMultiLayer(sparkSession.sparkContext, conf, tm)
sparkNetwork.setListeners(new ScoreIterationListener(1))

val saver = new InMemoryModelSaver[MultiLayerNetwork]
val esConf = new EarlyStoppingConfiguration.Builder()
        .epochTerminationConditions(
            new MaxEpochsTerminationCondition(numEpochs),
            new ScoreImprovementEpochTerminationCondition(5, 1e-3)
        )
        .iterationTerminationConditions(
            new MaxTimeIterationTerminationCondition(1, TimeUnit.MINUTES)  //TODO
        )
        .scoreCalculator(new SparkDataSetLossCalculator(validRdd, true, sparkSession.sparkContext) )
        .modelSaver(saver)
        .build()

val trainer = new SparkEarlyStoppingTrainer(sparkSession.sparkContext, tm, esConf, sparkNetwork.getNetwork, trainRdd.toJavaRDD())

val trainResult: EarlyStoppingResult[MultiLayerNetwork] = trainer.fit()
//...save model
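The elided save step could look roughly like the following sketch, which mirrors the HDFS read in the test part: it takes the best model chosen by the early-stopping score calculator and writes it with `ModelSerializer`. The use of `modelPath` here is an assumption (that it is the same path the test part later reads from).

```scala
// Sketch: persist the best model found by early stopping to HDFS.
// Assumes `trainResult`, `sparkSession`, and `modelPath` are defined as above.
import java.io.BufferedOutputStream
import org.apache.hadoop.fs.{FileSystem, Path}
import org.deeplearning4j.util.ModelSerializer

val bestModel = trainResult.getBestModel  // model selected by the score calculator
val fs = FileSystem.get(sparkSession.sparkContext.hadoopConfiguration)
val os = new BufferedOutputStream(fs.create(new Path(modelPath)))
try {
  ModelSerializer.writeModel(bestModel, os, true)  // true = also save updater state
} finally {
  os.close()
}
```

Saving the updater state (the `true` flag) matters only if training will resume later; for evaluation alone it is not required, which is why the test part restores with `restoreMultiLayerNetwork(is, true)` purely for symmetry.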

Test part:

val fileSystem = FileSystem.get(sparkSession.sparkContext.hadoopConfiguration)
val is = new BufferedInputStream(fileSystem.open(new Path(modelPath)))
val net = ModelSerializer.restoreMultiLayerNetwork(is, true)
val sparkNet = new SparkDl4jMultiLayer(sparkSession.sparkContext, net, null)
val eval: RegressionEvaluation = sparkNet.evaluateRegression(testTsDs)  // error occurred here

Error:

Exception in thread "main" java.lang.NullPointerException
	at org.deeplearning4j.spark.impl.multilayer.SparkDl4jMultiLayer.doEvaluation(SparkDl4jMultiLayer.java:681)
	at org.deeplearning4j.spark.impl.multilayer.SparkDl4jMultiLayer.evaluateRegression(SparkDl4jMultiLayer.java:583)
	at org.deeplearning4j.spark.impl.multilayer.SparkDl4jMultiLayer.evaluateRegression(SparkDl4jMultiLayer.java:571)
	at com.ximalaya.ops.copyright.model.LstmForIncomeV4$.testModel(LstmForIncomeV4.scala:371)
	at com.ximalaya.ops.copyright.model.LstmForIncomeV4$.main(LstmForIncomeV4.scala:100)
	at com.ximalaya.ops.copyright.model.LstmForIncomeV4.main(LstmForIncomeV4.scala)

The DL4J version is 1.0.0-beta6, the Spark version is 2.4.5, on macOS.

How exactly is your testTsDs defined? In principle this would only happen if treeAggregate on that data returns null.

Sorry, I found my mistake: testTsDs is empty. Sorry to bother you.
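For anyone hitting the same NPE: a cheap guard before evaluation turns the empty-RDD case into a clear error instead of a NullPointerException deep inside doEvaluation. This is just a sketch, assuming `testTsDs` and `sparkNet` are defined as in the test part above.

```scala
// Sketch: fail fast with a descriptive message when the test RDD is empty.
// Assumes `testTsDs` (an RDD of DataSet) and `sparkNet` exist as above.
require(!testTsDs.isEmpty(), "testTsDs is empty - nothing to evaluate")
val eval: RegressionEvaluation = sparkNet.evaluateRegression(testTsDs)
println(eval.stats())
```

Note that `isEmpty()` triggers a (small) Spark job, but it is far cheaper than a full `count()` on large data, since it only has to find a single element.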