How to optimize textCNN with non-static word2vec

I learned that word vectors from word2vec can be fine-tuned while training a textCNN model. How do I set the parameters for this in the textCNN example?

You can easily use any of the DL4J word vector implementations as the initialization for an embedding layer.

All you have to do is pass it to a WeightInitEmbedding and set that as the .weightInit for that layer:
https://deeplearning4j.org/api/latest/org/deeplearning4j/nn/weights/embeddings/WeightInitEmbedding.html

EmbeddingInitializer is implemented by all of the embedding-generating classes:

All Known Implementing Classes:
ArrayEmbeddingInitializer, FastText, Glove, Node2Vec, ParagraphVectors, SequenceVectors, SparkParagraphVectors, SparkSequenceVectors, SparkWord2Vec, StaticWord2Vec, Word2Vec, Word2Vec, WordVectorsImpl