Error in training -> Invalid input: expect CNN activations with rank 4

I’m trying to train a CNN on some suit images I have, but without success. I have a CSV file with 988 lines, and each line has 401 columns, where the last column is the expected suit. Each row encodes a 20x20 grayscale image (400 pixel values). I configured the layers this way:

        return new NeuralNetConfiguration.Builder()
                .seed(1611)
                .iterations(7)
                .regularization(false).l2(0.005)
                .activation(Activation.RELU)
                .learningRate(0.0001)
                .weightInit(WeightInit.XAVIER)
                .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
                .updater(new Nesterovs(0.9))
                .list()
                .layer(0, new Convolution1DLayer.Builder().name("cl1").kernelSize(5).padding(1).nIn(400).nOut(100).build())
                .layer(1, new Subsampling1DLayer.Builder().name("pl1").kernelSize(2).padding(1).build())
                .layer(2, new Convolution1DLayer.Builder().name("cl2").kernelSize(5).padding(1).nOut(100).build())
                .layer(3, new Subsampling1DLayer.Builder().name("pl2").kernelSize(2).padding(1).build())
                .layer(4, new DenseLayer.Builder().name("dense").nOut(100).build())
                .layer(5, new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD).name("output")
                        .nOut(4)
                        .activation(Activation.SOFTMAX)
                        .build())
                .backprop(true).pretrain(false)
                .setInputType(InputType.convolutional(20, 20, 1))
                .build();

Every time I try to train this CNN I get this error message:

Exception in thread "main" java.lang.IllegalArgumentException: Invalid input: expect CNN activations with rank 4 (received input with shape [988, 400])
	at org.deeplearning4j.nn.conf.preprocessor.CnnToRnnPreProcessor.preProcess(CnnToRnnPreProcessor.java:56)
	at org.deeplearning4j.nn.multilayer.MultiLayerNetwork.activationFromPrevLayer(MultiLayerNetwork.java:788)
	at org.deeplearning4j.nn.multilayer.MultiLayerNetwork.feedForwardToLayer(MultiLayerNetwork.java:929)
	at org.deeplearning4j.nn.multilayer.MultiLayerNetwork.computeGradientAndScore(MultiLayerNetwork.java:2224)
	at org.deeplearning4j.optimize.solvers.BaseOptimizer.gradientAndScore(BaseOptimizer.java:174)
	at org.deeplearning4j.optimize.solvers.StochasticGradientDescent.optimize(StochasticGradientDescent.java:60)
	at org.deeplearning4j.optimize.Solver.optimize(Solver.java:53)
	at org.deeplearning4j.nn.multilayer.MultiLayerNetwork.fit(MultiLayerNetwork.java:1780)
	at org.deeplearning4j.nn.multilayer.MultiLayerNetwork.fit(MultiLayerNetwork.java:1729)
	at org.deeplearning4j.nn.multilayer.MultiLayerNetwork.fit(MultiLayerNetwork.java:1832)
	at com.brevleq.poker.dude.SuitTrainer.train(SuitTrainer.java:51)
	at com.brevleq.poker.dude.SuitTrainer.main(SuitTrainer.java:40)

Could someone explain what is wrong?

@brevleq you’re passing 2d input into a network that expects 4d. You’ll need to reshape your input.
From the looks of it, though, you’re using the 1D CNN layers? Sorry that this isn’t intuitive, but those expect InputType.recurrent instead, since 1D convolutions are for time series data.

The convolutional input type (InputType.convolutional) is mainly for images.
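If what you actually want is to treat each row as a 20x20 image (with 2D convolution layers), the reshape the error is complaining about looks roughly like this. This is just a minimal sketch; features stands for whatever INDArray holds your 400-column feature matrix:

    // features has shape [988, 400]: one flattened 20x20 grayscale image per row.
    // 2D CNN layers expect rank-4 input in [minibatch, channels, height, width] order.
    INDArray cnnInput = features.reshape(features.size(0), 1, 20, 20);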

Regarding the rest: I’m guessing you followed a really old tutorial. Please get rid of backprop(true).pretrain(false) and iterations(7).

Stick to the defaults there, and use a for loop + fit(…) to train your network instead of iterations().
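Something like this, for example (a minimal sketch; conf is your configuration and trainData is a hypothetical DataSetIterator over your CSV):

    MultiLayerNetwork net = new MultiLayerNetwork(conf);
    net.init();

    int numEpochs = 10; // tune this for your data
    for (int i = 0; i < numEpochs; i++) {
        net.fit(trainData);  // one full pass over the training data
        trainData.reset();   // rewind the iterator before the next epoch, if your iterator needs it
    }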

Thanks for your answer.
Should I prepare my entire dataset again? This dataset was created and used to train a simple multilayer network:

        return new NeuralNetConfiguration.Builder()
                .iterations(1000)
                .activation(Activation.SIGMOID)
                .weightInit(WeightInit.XAVIER)
                .learningRate(0.05)
                .regularization(true).l2(0.0001)
                .list()
                .layer(0, new DenseLayer.Builder().nIn(LABEL_INDEX).nOut(100).build())
                .layer(1, new DenseLayer.Builder().nIn(100).nOut(100).build())
                .layer(2, new DenseLayer.Builder().nIn(100).nOut(100).build())
                .layer(3, new OutputLayer.Builder(
                        LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                        .activation(Activation.SOFTMAX)
                        .nIn(100).nOut(SUITS_QUANTITY).build())
                .backprop(true).pretrain(false)
                .build();

As I did not achieve a good match percentage, I decided to try a CNN. My current dataset consists of 20x20 binary (black and white) images. Here are two lines of the dataset:

0.7631764705882352,0.7631764705882352,0.7631764705882352,0.7670588235294117,0.7670588235294117,0.7631764705882352,0.7592941176470588,0.7631764705882352,0.7709411764705882,0.7709411764705882,0.7903529411764706,0.7709411764705882,0.7592941176470588,0.7592941176470588,0.7670588235294117,0.7670588235294117,0.7592941176470588,0.7554117647058823,0.7554117647058823,0.7592941176470588,0.7787058823529412,0.7748235294117647,0.7748235294117647,0.7670588235294117,0.7631764705882352,0.7670588235294117,0.7670588235294117,0.7631764705882352,0.7592941176470588,0.7864705882352941,0.7670588235294117,0.7476470588235294,0.7592941176470588,0.7592941176470588,0.7631764705882352,0.7592941176470588,0.7554117647058823,0.7515294117647059,0.7592941176470588,0.7631764705882352,0.7670588235294117,0.7670588235294117,0.7709411764705882,0.7825882352941177,0.7709411764705882,0.7592941176470588,0.7670588235294117,0.7554117647058823,0.7670588235294117,0.6350588235294118,0.4292941176470588,0.7670588235294117,0.7515294117647059,0.7631764705882352,0.7592941176470588,0.7592941176470588,0.7554117647058823,0.7592941176470588,0.7631764705882352,0.7631764705882352,0.7825882352941177,0.7825882352941177,0.7670588235294117,0.7787058823529412,0.7631764705882352,0.736,0.7787058823529412,0.7592941176470588,0.7825882352941177,0.3205882352941177,0.3438823529411765,0.7243529411764705,0.794235294117647,0.7981176470588235,0.7631764705882352,0.7670588235294117,0.7592941176470588,0.7631764705882352,0.7631764705882352,0.7592941176470588,0.7787058823529412,0.7825882352941177,0.7825882352941177,0.8175294117647058,0.7981176470588235,0.7864705882352941,0.7787058823529412,0.7748235294117647,0.47200000000000003,0.35552941176470587,0.3594117647058824,0.39435294117647063,0.7748235294117647,0.7321176470588235,0.7592941176470588,0.7709411764705882,0.7670588235294117,0.7709411764705882,0.7670588235294117,0.7670588235294117,0.7981176470588235,0.8097647058823529,0.7864705882352941,0.8175294117647058,0.7903529411764706,0.7554117647058823,0.7515294117647059,0.4914117647058823,0.3516470588235294,0.3632941176470588,0.3632941176470588,0.3710588235294118,0.5807058823529412,0.8097647058823529,0.7748235294117647,0.7748235294117647,0.7631764705882352,0.7709411764705882,0.7709411764705882,0.7709411764705882,0.8291764705882353,0.7864705882352941,0.7825882352941177,0.860235294117647,0.8252941176470587,0.736,0.5418823529411765,0.2856470588235294,0.3283529411764706,0.3283529411764706,0.3361176470588235,0.31670588235294117,0.3244705882352941,0.6389411764705882,0.7825882352941177,0.8175294117647058,0.7592941176470588,0.7825882352941177,0.7709411764705882,0.7748235294117647,0.8485882352941176,0.8136470588235294,0.802,0.8175294117647058,0.860235294117647,0.4875294117647059,0.3283529411764706,0.3244705882352941,0.3361176470588235,0.35552941176470587,0.3438823529411765,0.3438823529411765,0.347764705882353,0.30505882352941177,0.5496470588235294,0.8175294117647058,0.7864705882352941,0.7787058823529412,0.7787058823529412,0.7787058823529412,0.8058823529411765,0.8175294117647058,0.7825882352941177,0.7709411764705882,0.4098823529411765,0.4370588235294118,0.33999999999999997,0.3205882352941177,0.3128235294117647,0.3128235294117647,0.2934117647058823,0.3128235294117647,0.3322352941176471,0.28952941176470587,0.3361176470588235,0.4098823529411765,0.7903529411764706,0.7787058823529412,0.794235294117647,0.7709411764705882,0.7592941176470588,0.7981176470588235,0.6816470588235295,0.47200000000000003,0.4875294117647059,0.4331764705882353,0.25458823529411767,0.347764705882353,0.3050588235294117
7,0.3205882352941177,0.3205882352941177,0.3244705882352941,0.30505882352941177,0.30505882352941177,0.3361176470588235,0.36717647058823527,0.3011764705882353,0.6156470588235294,0.7631764705882352,0.8097647058823529,0.7903529411764706,0.4292941176470588,0.5690588235294117,0.45647058823529413,0.49529411764705883,0.30505882352941177,0.3244705882352941,0.28952941176470587,0.3283529411764706,0.2972941176470589,0.28952941176470587,0.3128235294117647,0.2972941176470589,0.28952941176470587,0.3128235294117647,0.347764705882353,0.3205882352941177,0.3438823529411765,0.347764705882353,0.8136470588235294,0.7864705882352941,0.8136470588235294,0.6583529411764706,0.3244705882352941,0.3089411764705882,0.3089411764705882,0.2934117647058823,0.30505882352941177,0.3244705882352941,0.3244705882352941,0.3205882352941177,0.3089411764705882,0.2972941176470589,0.33999999999999997,0.347764705882353,0.3244705882352941,0.3244705882352941,0.5845882352941177,0.7748235294117647,0.7748235294117647,0.8058823529411765,0.8136470588235294,0.8175294117647058,0.802,0.4875294117647059,0.3089411764705882,0.3089411764705882,0.3205882352941177,0.3322352941176471,0.31670588235294117,0.3205882352941177,0.3283529411764706,0.3438823529411765,0.3361176470588235,0.3322352941176471,0.4137647058823529,0.802,0.7864705882352941,0.794235294117647,0.794235294117647,0.8214117647058823,0.8175294117647058,0.8058823529411765,0.8136470588235294,0.8369411764705883,0.6467058823529412,0.25070588235294117,0.33999999999999997,0.3322352941176471,0.30505882352941177,0.3438823529411765,0.36717647058823527,0.3361176470588235,0.3244705882352941,0.538,0.8214117647058823,0.8291764705882353,0.7981176470588235,0.8058823529411765,0.8058823529411765,0.8097647058823529,0.8058823529411765,0.8097647058823529,0.794235294117647,0.8058823529411765,0.8175294117647058,0.6699999999999999,0.31670588235294117,0.30505882352941177,0.3361176470588235,0.3516470588235294,0.3632941176470588,0.39435294117647063,0.5574117647058824,0.7787058823529412,0.7748235294117647,0.7981176470588235,0.794235294117647,0.7903529411764706,0.7903529411764706,0.8097647058823529,0.8097647058823529,0.8097647058823529,0.8097647058823529,0.802,0.8175294117647058,0.8330588235294119,0.464235294117647,0.33999999999999997,0.3710588235294118,0.3594117647058824,0.3438823529411765,0.5030588235294118,0.8058823529411765,0.8175294117647058,0.8136470588235294,0.7864705882352941,0.794235294117647,0.8058823529411765,0.8058823529411765,0.8136470588235294,0.8136470588235294,0.8136470588235294,0.8097647058823529,0.8097647058823529,0.794235294117647,0.8097647058823529,0.8058823529411765,0.3710588235294118,0.4021176470588235,0.3594117647058824,0.33999999999999997,0.7825882352941177,0.8408235294117647,0.8136470588235294,0.7981176470588235,0.8058823529411765,0.8097647058823529,0.802,0.802,0.8175294117647058,0.8175294117647058,0.8175294117647058,0.8175294117647058,0.8136470588235294,0.8214117647058823,0.8214117647058823,0.8175294117647058,0.7592941176470588,0.347764705882353,0.3788235294117647,0.7476470588235294,0.8136470588235294,0.8097647058823529,0.8097647058823529,0.802,0.8097647058823529,0.802,0.802,0.7981176470588235,0.8175294117647058,0.8175294117647058,0.8214117647058823,0.8214117647058823,0.8175294117647058,0.8214117647058823,0.8136470588235294,0.8136470588235294,0.8097647058823529,0.5069411764705882,0.5069411764705882,0.8058823529411765,0.8097647058823529,0.8097647058823529,0.8058823529411765,0.8058823529411765,0.8136470588235294,0.8097647058823529,0.8097647058823529,0.8058823529411765,0.8252941176470587,0.8252941176
470587,0.8252941176470587,0.8214117647058823,0.8214117647058823,0.8175294117647058,0.8136470588235294,0.8097647058823529,0.8214117647058823,0.7981176470588235,0.7981176470588235,0.8175294117647058,0.8136470588235294,0.8175294117647058,0.8136470588235294,0.8136470588235294,0.8175294117647058,0.8214117647058823,0.8175294117647058,0.8175294117647058,1

0.01776470588235294,0.01,0.02164705882352941,0.01,0.01388235294117647,0.01,0.01776470588235294,0.01,0.01388235294117647,0.025529411764705884,0.01,0.02164705882352941,0.02164705882352941,0.01776470588235294,0.01,0.01,0.01,0.01388235294117647,0.01388235294117647,0.01,0.01388235294117647,0.01,0.01388235294117647,0.01,0.01388235294117647,0.01388235294117647,0.01,0.02164705882352941,0.01776470588235294,0.029411764705882353,0.11870588235294117,0.029411764705882353,0.01,0.01388235294117647,0.02164705882352941,0.01,0.01,0.01388235294117647,0.01388235294117647,0.01,0.01,0.01776470588235294,0.01,0.01776470588235294,0.025529411764705884,0.01388235294117647,0.01,0.02164705882352941,0.025529411764705884,0.30505882352941177,0.5768235294117646,0.041058823529411766,0.01776470588235294,0.029411764705882353,0.01,0.025529411764705884,0.01,0.01388235294117647,0.01388235294117647,0.01388235294117647,0.01,0.01388235294117647,0.029411764705882353,0.01,0.01388235294117647,0.01,0.01,0.025529411764705884,0.08376470588235294,0.5574117647058824,0.538,0.4292941176470588,0.025529411764705884,0.01388235294117647,0.025529411764705884,0.01,0.01388235294117647,0.01388235294117647,0.01388235294117647,0.01388235294117647,0.01388235294117647,0.01,0.01388235294117647,0.01388235294117647,0.01776470588235294,0.025529411764705884,0.01776470588235294,0.03329411764705882,0.5612941176470588,0.5341176470588236,0.5496470588235294,0.5535294117647059,0.2701176470588236,0.01776470588235294,0.01,0.025529411764705884,0.01388235294117647,0.01388235294117647,0.01,0.01,0.01,0.01776470588235294,0.01,0.01388235294117647,0.01388235294117647,0.01776470588235294,0.025529411764705884,0.5302352941176471,0.5418823529411765,0.538,0.5535294117647059,0.5185882352941176,0.5496470588235294,0.1575294117647059,0.01388235294117647,0.01776470588235294,0.01776470588235294,0.01,0.01,0.01,0.01,0.01,0.01388235294117647,0.01776470588235294,0.01,0.029411764705882353,0.4603529411764706,0.538,0.538,0.5535294117647059,0.5418823529411765,0.5496470588235294,0.5496470588235294,0.5574117647058824,0.11094117647058822,0.02164705882352941,0.01776470588235294,0.01388235294117647,0.01,0.01388235294117647,0.01776470588235294,0.01776470588235294,0.01776470588235294,0.01388235294117647,0.044941176470588234,0.5185882352941176,0.5341176470588236,0.5418823529411765,0.538,0.538,0.5418823529411765,0.5341176470588236,0.538,0.5496470588235294,0.5574117647058824,0.1847058823529412,0.01776470588235294,0.01388235294117647,0.01,0.01776470588235294,0.01,0.029411764705882353,0.02164705882352941,0.09152941176470587,0.5768235294117646,0.5418823529411765,0.5535294117647059,0.538,0.5418823529411765,0.5418823529411765,0.5457647058823529,0.5341176470588236,0.5341176470588236,0.5457647058823529,0.538,0.5574117647058824,0.3322352941176471,0.03329411764705882,0.01388235294117647,0.01388235294117647,0.02164705882352941,0.02164705882352941,0.3128235294117647,0.5574117647058824,0.5302352941176471,0.5535294117647059,0.538,0.538,0.5302352941176471,0.5535294117647059,0.5341176470588236,0.5535294117647059,0.5457647058823529,0.5341176470588236,0.5496470588235294,0.5302352941176471,0.5341176470588236,0.4914117647058823,0.09541176470588235,0.03329411764705882,0.01388235294117647,0.1264705882352941,0.5729411764705882,0.5263529411764706,0.5496470588235294,0.5341176470588236,0.538,0.5496470588235294,0.5535294117647059,0.5341176470588236,0.5457647058823529,0.5263529411764706,0.5535294117647059,0.5302352941176471,0.538,0.5418823529411765,0.5535294117647059,0.5457647058823529,0.35552941176470587,0.01,0.021647058823529
41,0.01388235294117647,0.0371764705882353,0.42152941176470593,0.5496470588235294,0.538,0.5418823529411765,0.5341176470588236,0.5418823529411765,0.5341176470588236,0.5457647058823529,0.5457647058823529,0.538,0.5418823529411765,0.5341176470588236,0.5418823529411765,0.5651764705882353,0.10705882352941176,0.01776470588235294,0.02164705882352941,0.02164705882352941,0.01,0.01,0.0371764705882353,0.24682352941176472,0.5496470588235294,0.5341176470588236,0.5457647058823529,0.5341176470588236,0.5418823529411765,0.5457647058823529,0.5457647058823529,0.5418823529411765,0.5341176470588236,0.5341176470588236,0.4875294117647059,0.0371764705882353,0.01,0.029411764705882353,0.01776470588235294,0.01,0.01388235294117647,0.01776470588235294,0.01776470588235294,0.02164705882352941,0.14200000000000002,0.5807058823529412,0.5302352941176471,0.538,0.5457647058823529,0.5341176470588236,0.538,0.5418823529411765,0.5496470588235294,0.3788235294117647,0.029411764705882353,0.025529411764705884,0.01776470588235294,0.01,0.025529411764705884,0.01388235294117647,0.01,0.02164705882352941,0.01,0.02164705882352941,0.01776470588235294,0.11094117647058822,0.5612941176470588,0.538,0.5302352941176471,0.5496470588235294,0.5496470588235294,0.5302352941176471,0.3904705882352941,0.0371764705882353,0.01,0.01,0.01,0.01,0.01,0.01,0.01,0.01776470588235294,0.01388235294117647,0.01776470588235294,0.025529411764705884,0.02164705882352941,0.19247058823529414,0.5418823529411765,0.5457647058823529,0.5341176470588236,0.5535294117647059,0.47200000000000003,0.02164705882352941,0.01776470588235294,0.01,0.01,0.01,0.025529411764705884,0.01,0.01,0.01,0.01388235294117647,0.01388235294117647,0.01776470588235294,0.01776470588235294,0.01776470588235294,0.025529411764705884,0.3283529411764706,0.5457647058823529,0.5418823529411765,0.5807058823529412,0.03329411764705882,0.01776470588235294,0.01388235294117647,0.01,0.01,0.01,0.01,0.01,0.01388235294117647,0.01388235294117647,0.01388235294117647,0.01388235294117647,0.01388235294117647,0.01388235294117647,0.01388235294117647,0.01776470588235294,0.03329411764705882,0.48364705882352943,0.5612941176470588,0.09929411764705881,0.01388235294117647,0.01,0.01776470588235294,0.01,0.01,0.01,0.01,0.01,0.01388235294117647,0.01388235294117647,0.01388235294117647,0.01388235294117647,0.01388235294117647,0.01,0.01,0.01388235294117647,0.029411764705882353,0.08764705882352941,0.35552941176470587,0.025529411764705884,0.01,0.02164705882352941,0.01,0.029411764705882353,0.01,0.01,0.01,0.01,0.01,0.01,0.01388235294117647,0.01388235294117647,0.01,0.01,0.01388235294117647,0.01776470588235294,0.01,0.02164705882352941,0.03329411764705882,0.01388235294117647,0.01776470588235294,0.02164705882352941,0.01,0.01,0.01,0.01,0.01,0.01,1

Can’t I reuse this dataset? What kind of images do I need to train a CNN?

@brevleq I’m a bit confused: now you’re posting about a 2d network. DenseLayers only take 2d input.
I need you to stay focused on one problem and not randomly change networks on me. As it stands, I don’t know what may or may not have changed in your problem.

Given the randomness here, my response is going to be fairly standard: make sure you understand what kind of shapes each kind of layer can take in. Do not set nIn on each layer manually; you should be using setInputType to do that. We do that for you in the examples (just search depending on what you want to do).
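For example, for flattened 20x20 grayscale rows like yours, a 2D CNN configuration could look roughly like this. This is only a sketch to show the setInputType idea, not a tuned config (add your updater/learning rate as usual); the convolutionalFlat input type tells DL4J that each input row is a flattened 20x20x1 image, so it inserts the reshape and computes every nIn for you:

    return new NeuralNetConfiguration.Builder()
            .seed(1611)
            .weightInit(WeightInit.XAVIER)
            .activation(Activation.RELU)
            .list()
            .layer(0, new ConvolutionLayer.Builder(5, 5).nOut(20).build())
            .layer(1, new SubsamplingLayer.Builder().kernelSize(2, 2).stride(2, 2).build())
            .layer(2, new DenseLayer.Builder().nOut(100).build())
            .layer(3, new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                    .activation(Activation.SOFTMAX)
                    .nOut(4)
                    .build())
            // each [minibatch, 400] row is treated as a 20x20x1 image;
            // the reshape and all nIn values are handled automatically
            .setInputType(InputType.convolutionalFlat(20, 20, 1))
            .build();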
Lastly, whatever you decide to do, understand your label shapes as well. Are they 2d classification labels? Time series? I can’t tell you what that is unless you define the problem for yourself.

If you can clearly tell me, without changing anything, what it is you WANT to do, I will tell you how to do it.