Is there a Flatten layer in DL4J?

Does DL4J have a flatten layer?

@minyueyong just ensure your pre processor is set. Internally in dl4j we manipulate vectors using pre processors. You can see an example from keras here: deeplearning4j/ at 96790ea7ce13d72b918a2ce244468b7d4d2c711a · eclipse/deeplearning4j · GitHub

Basically don’t worry about it, just make sure you call setInputType and we’ll inject the flatten where it’s needed. If you don’t know what I’m referencing please look at some of the examples: Search · setInputType · GitHub
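To make that concrete, here is a minimal sketch of what "just call setInputType" looks like in a configuration. The layer sizes and the 28x28 single-channel input shape are illustrative assumptions, not anything from this thread; the point is that there is no manual flatten layer anywhere, and DL4J injects the CNN-to-dense preprocessor for you:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.ConvolutionLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class FlattenFreeConfig {
    public static void main(String[] args) {
        // No manual flatten: setInputType tells DL4J the input shape,
        // and the flatten preprocessor is inserted automatically
        // between the convolutional output and the dense output layer.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(new ConvolutionLayer.Builder(3, 3).nOut(16).build())
                .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nOut(10).activation(Activation.SOFTMAX).build())
                // 28x28 grayscale input is an assumed example shape
                .setInputType(InputType.convolutional(28, 28, 1))
                .build();
    }
}
```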

In our samediff framework (think tensorflow/pytorch for lower level control) we do have a flatten operation (which is also used by the higher level dl4j networks)
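If you do want the lower-level control, flattening in samediff is just a reshape. This is a sketch, assuming an arbitrary [7, 4, 3] placeholder shape; the -1 lets the remaining dimension be inferred:

```java
import org.nd4j.autodiff.samediff.SDVariable;
import org.nd4j.autodiff.samediff.SameDiff;
import org.nd4j.linalg.api.buffer.DataType;

public class SameDiffFlattenSketch {
    public static void main(String[] args) {
        SameDiff sd = SameDiff.create();
        // Assumed example shape: 7 examples, 4 features, 3 time steps
        SDVariable x = sd.placeHolder("x", DataType.FLOAT, 7, 4, 3);
        // Flatten everything past the first dimension: [7, 4, 3] -> [7, 12]
        SDVariable flat = x.reshape(7, -1);
    }
}
```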

So I have input and label with shape [7, 4]

I am using an Embedding Sequence layer, which returns a 3D array, and I get an error when passing it into a Dense Layer because a Dense Layer only accepts a 2D array as input

I was wondering if I am putting the setInputType in the correct place?

@minyueyong yes, pre processors get inserted in the network when you call that setInputType. I notice you’re using an embedding layer first though. You should use InputType.rnn(…) first.

You’re not thinking about setInputType correctly. You don’t need to manually do anything just tell the configuration what your expected input type is and then as needed depending on what your layers are things like flatten will be inserted automatically if they are needed in the network.

Just get rid of the thinking that you need to manually flatten and just declare your layers.
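As a sketch of "just declare your layers": this is roughly what the configuration could look like for the setup in this thread. The vocabulary size of 50 matches the InputType.recurrent(50) mentioned below; the embedding and dense sizes are assumptions. Note the dense layer follows the 3D embedding output directly, and the RNN-to-feedforward preprocessor is injected automatically:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.EmbeddingSequenceLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class EmbeddingNoManualFlatten {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(new EmbeddingSequenceLayer.Builder()
                        .nOut(16)   // embedding dimension -- illustrative assumption
                        .build())
                .layer(new DenseLayer.Builder().nOut(32).build())  // assumed size
                .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nOut(4).activation(Activation.SOFTMAX).build())
                // Declared once for the whole network; nIn values and any
                // needed preprocessors are filled in from this.
                .setInputType(InputType.recurrent(50))
                .build();
    }
}
```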

I have added setInputType(InputType.recurrent(50)) before the Embedding Sequence Layer. May I ask why I have to put it before the Embedding Sequence Layer instead of the Dense Layer?

Now I have another error: Labels and preOutput must have equal shapes: got shapes [7, 4] vs [28, 4]

@minyueyong the embedding sequence layer expects RNN-like input, so make sure it’s 3d input even if it’s 1 time step. Read up more on all this here: Recurrent Layers - Deeplearning4j
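A sketch of that reshape, using the [7, 4] shape from this thread (the values here are random data, just to show the shapes): 2D [miniBatch, features] becomes 3D [miniBatch, features, timeSteps] with a single time step.

```java
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class ReshapeTo3d {
    public static void main(String[] args) {
        // 2D input: 7 examples x 4 features (dummy data)
        INDArray features2d = Nd4j.rand(7, 4);
        // Add a time dimension of length 1 -> shape [7, 4, 1]
        INDArray features3d = features2d.reshape(7, 4, 1);
        System.out.println(java.util.Arrays.toString(features3d.shape()));
    }
}
```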