Could not create a descriptor for a dilated convolution forward propagation primitive

When I try to train my model, which is built with a ComputationGraph, it fails with this error:
Caused by: java.lang.RuntimeException: could not create a descriptor for a dilated convolution forward propagation primitive
How can I solve it?

I changed the backend to CUDA and the problem was solved.
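For anyone checking this on their side: the backend is whichever ND4J artifact is on the classpath (nd4j-native for CPU, nd4j-cuda-* for GPU), and a quick way to confirm which one is actually active at runtime is something like the sketch below (the class name is just an assumption for illustration):

import org.nd4j.linalg.factory.Nd4j;

public class BackendCheck {
    public static void main(String[] args) {
        // Prints the backend class picked up from the classpath,
        // e.g. a CPU (nd4j-native) or CUDA (nd4j-cuda) backend.
        System.out.println("Active ND4J backend: " + Nd4j.getBackend().getClass().getName());
    }
}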

I’m not sure this is a “solution” per se :slight_smile: Could you at least tell us what happened here? If there’s a problem with our CPU backend, we should look into it if possible.

It seems that if the graph includes a Convolution1DLayer with dilation configured, the error appears.

@xianxuan could you post a sample config here that reproduces your issue or file an issue if you didn’t already? If you did, could you link it here? Thanks!

graph
        // Dilated causal conv block: conv1d -> relu -> dropout, twice
        .addLayer(nameLayer(blockName, "cnn1"),
                new Convolution1DLayer.Builder(kernelSize)
                        .dilation(dilation)
                        .convolutionMode(ConvolutionMode.Causal)
                        .nOut(channel).build(),
                input)
        .addLayer(nameLayer(blockName, "relu1"),
                new ActivationLayer.Builder().activation(Activation.RELU).build(),
                nameLayer(blockName, "cnn1"))
        .addLayer(nameLayer(blockName, "dropout1"),
                new DropoutLayer.Builder(dropout).build(),
                nameLayer(blockName, "relu1"))
        .addLayer(nameLayer(blockName, "cnn2"),
                new Convolution1DLayer.Builder(kernelSize)
                        .dilation(dilation)
                        .convolutionMode(ConvolutionMode.Causal)
                        .nIn(channel).nOut(channel).build(),
                nameLayer(blockName, "dropout1"))
        .addLayer(nameLayer(blockName, "relu2"),
                new ActivationLayer.Builder().activation(Activation.RELU).build(),
                nameLayer(blockName, "cnn2"))
        .addLayer(nameLayer(blockName, "dropout2"),
                new DropoutLayer.Builder(dropout).build(),
                nameLayer(blockName, "relu2"));

// Input/output wiring
graph.addInputs(input)
        .setInputTypes(InputType.recurrent(featureSize, timeSteps))
        .addLayer("crop",
                new Cropping1D.Builder(timeSteps - 1, 0).build(),
                nameLayer(blockName, "relu2"))
        .addLayer("outputLayer",
                new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                        .activation(Activation.SOFTMAX)
                        .nOut(outputSize)
                        .weightInit(WeightInit.XAVIER).build(),
                "crop")
        .setOutputs("outputLayer");
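If anyone else hits this on the CPU backend before a fix lands, one thing that may be worth trying is turning off the platform helpers so ND4J falls back to its built-in convolution code instead of the MKL-DNN path. This is only a sketch and assumes your ND4J version exposes Nd4j.getEnvironment().allowHelpers(...); it will usually be slower than the helper path:

import org.nd4j.linalg.factory.Nd4j;

// Call this once, before building/fitting the ComputationGraph.
// Disabling the platform helpers makes ND4J skip the MKL-DNN (oneDNN)
// accelerated ops and use its own implementations instead.
Nd4j.getEnvironment().allowHelpers(false);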