How to optimize textCNN with non-static word2vec

I learned that the word vectors from word2vec can be fine-tuned while training a textCNN model. How do I set the parameters for this in the textCNN example?

You can easily use any of the DL4J word vector implementations as the initialization for an embedding layer.

All you have to do is pass it to `.weightInit(...)` on that layer, which accepts anything implementing `EmbeddingInitializer`.
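A minimal sketch of what that looks like. The model path and layer sizes here are hypothetical placeholders; it assumes `deeplearning4j-nlp` is on the classpath, and uses `WordVectorSerializer.readWord2VecModel` to load pretrained vectors:

```java
import java.io.File;

import org.deeplearning4j.models.embeddings.loader.WordVectorSerializer;
import org.deeplearning4j.models.word2vec.Word2Vec;
import org.deeplearning4j.nn.conf.layers.EmbeddingLayer;

public class EmbeddingInitExample {
    public static void main(String[] args) {
        // Load a pretrained model (path is a placeholder).
        Word2Vec w2v = WordVectorSerializer.readWord2VecModel(new File("path/to/vectors.bin"));

        // Word2Vec implements EmbeddingInitializer, so it can be passed
        // directly to weightInit(...) on the embedding layer. The weights
        // are then fine-tuned along with the rest of the network during training.
        EmbeddingLayer embedding = new EmbeddingLayer.Builder()
                .weightInit(w2v)
                .build();
    }
}
```

Since the layer's weights are initialized (not frozen), backpropagation will update them during training, which is the "non-static" behavior asked about.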

`EmbeddingInitializer` is implemented by all of the embedding-generating classes:

All Known Implementing Classes:
ArrayEmbeddingInitializer, FastText, Glove, Node2Vec, ParagraphVectors, SequenceVectors, SparkParagraphVectors, SparkSequenceVectors, SparkWord2Vec, StaticWord2Vec, Word2Vec, Word2Vec, WordVectorsImpl

Where is this textCNN example? I tried to find it in the docs and in the deeplearning4j-examples repo, but I couldn't find it.

@jcabot We deleted a lot of the older examples in the most recent release due to lack of polish. You can find the old examples here: GitHub - eclipse/deeplearning4j-examples at ab_beta7