My MergeVertex is complaining that it is getting layers with different types as input. They should all be RNN, since I am using a 1D CNN with an input vector of 483 values.
ComputationGraphConfiguration.GraphBuilder graph = new NeuralNetConfiguration.Builder()
    .seed(seed)
    .activation(Activation.SWISH)
    .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
    .updater(new Adam(0.0003, 0.9, 0.999, 0.1))
    .weightInit(WeightInit.XAVIER)
    .miniBatch(true)
    .cacheMode(CacheMode.NONE)
    .trainingWorkspaceMode(WorkspaceMode.ENABLED)
    .inferenceWorkspaceMode(WorkspaceMode.ENABLED)
    .convolutionMode(ConvolutionMode.Causal)
    .graphBuilder();
graph.setInputTypes(InputType.recurrent(483));
//stem
graph
.addLayer("stem-cnn1",new Convolution1DLayer.Builder(3,2).nIn(483).nOut(32).build(),"input")
.addLayer("stem-batch1", new BatchNormalization.Builder(false).decay(0.995).eps(0.001).nIn(32).nOut(32).build(),"stem-cnn1")
.addLayer("stem-cnn2",new Convolution1DLayer.Builder(3).nIn(32).nOut(32).build(),"stem-batch1")
.addLayer("stem-batch2",new BatchNormalization.Builder(false).decay(0.995).eps(0.001).nIn(32).nOut(32).build(),"stem-cnn2")
.addLayer("stem-cnn3",new Convolution1DLayer.Builder(3).nIn(32).nOut(64).build(),"stem-batch2")
.addLayer("stem-batch3", new BatchNormalization.Builder(false).decay(0.995).eps(0.001).nIn(64).nOut(64).build(), "stem-cnn3")
//left branch
.addLayer("stem-pool1",new Subsampling1DLayer.Builder(Subsampling1DLayer.PoolingType.MAX, 3, 2).build(),"stem-batch3")
//right branch
.addLayer("stem-cnn4",new Convolution1DLayer.Builder(3,2).nIn(64).nOut(96).build(),"stem-batch3")
.addLayer("stem-batch4", new BatchNormalization.Builder(false).decay(0.995).eps(0.001).nIn(96).nOut(96).build(), "stem-cnn4")
//merge
.addVertex("concat1", new MergeVertex(),"stem-pool1", "stem-batch4")
The error:
Invalid input: MergeVertex cannot merge activations of different types: first type = RNN, input type 2 = FF
at org.deeplearning4j.nn.conf.graph.MergeVertex.getOutputType(MergeVertex.java:139)
at org.deeplearning4j.nn.conf.ComputationGraphConfiguration.getLayerActivationTypes(ComputationGraphConfiguration.java:537)
at org.deeplearning4j.nn.conf.ComputationGraphConfiguration.addPreProcessors(ComputationGraphConfiguration.java:450)
at org.deeplearning4j.nn.conf.ComputationGraphConfiguration$GraphBuilder.build(ComputationGraphConfiguration.java:1202)
For some reason the BatchNormalization output is being treated as feed forward, which is messing things up. I assume BatchNormalization is supported for 1D CNNs? Is some preprocessor not getting triggered, maybe?
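In case it helps while waiting for an answer: if the problem is that BatchNormalization reports a feed-forward output type, one possible workaround (a sketch only, not tested against this exact network) is to convert the batch-norm branch back to RNN layout before the merge, using a PreprocessorVertex wrapping a FeedForwardToRnnPreProcessor. The vertex name "bn4-to-rnn" below is made up for illustration:

```java
import org.deeplearning4j.nn.conf.graph.MergeVertex;
import org.deeplearning4j.nn.conf.graph.PreprocessorVertex;
import org.deeplearning4j.nn.conf.preprocessor.FeedForwardToRnnPreProcessor;

// ...inside the same graphBuilder chain, replacing the original merge:

// Convert the FF activations coming out of stem-batch4 back to RNN layout,
// so that both inputs to the MergeVertex report the same activation type.
graph
    .addVertex("bn4-to-rnn",
               new PreprocessorVertex(new FeedForwardToRnnPreProcessor()),
               "stem-batch4")
    .addVertex("concat1", new MergeVertex(), "stem-pool1", "bn4-to-rnn");
```

This only changes the declared type and the reshaping; it does not change what BatchNormalization computes, so whether the statistics are what you want for time-series data is a separate question.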
On an unrelated issue:
new FileStatsStorage(new File("/home/workspace/netStats/test1.dat")); to attach to the UI server is not working. It complains that the file does not exist, and if I create an empty text file with the same name it complains that it is not a valid MapDB database. So how do I create the file in the first place for saving the stats? new InMemoryStatsStorage(); works fine.
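As far as I know, FileStatsStorage creates and initialises the MapDB file itself when it is absent, so one thing worth checking (this is an assumption on my part, not a confirmed diagnosis) is whether the parent directory /home/workspace/netStats/ exists. A hypothetical helper that prepares the path before handing it to FileStatsStorage might look like this:

```java
import java.io.File;

public class StatsFile {
    // Hypothetical helper: ensure the parent directory exists before passing
    // the path to FileStatsStorage. FileStatsStorage should create and
    // initialise the MapDB file itself if it is absent -- do NOT pre-create
    // an empty file, because an empty file is not a valid MapDB database.
    public static File prepareStatsFile(String path) {
        File f = new File(path);
        File parent = f.getParentFile();
        if (parent != null && !parent.exists()) {
            parent.mkdirs(); // create missing directories on the path
        }
        if (f.exists() && f.length() == 0) {
            f.delete(); // remove an invalid empty placeholder file
        }
        return f;
    }
}
```

Usage would then be new FileStatsStorage(StatsFile.prepareStatsFile("/home/workspace/netStats/test1.dat")).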
Thanks in advance