 # Multilayer Network parameters export

Hello,

I'm trying to manually convert a DL4J MultiLayerNetwork (MLN) model into my project's model structure. For that I have to copy the MLN's parameters, so I iterate over the layers:

``````List<NNLayer> layers = Arrays.stream(dl4jNetwork.getLayers())
        .map(DL4JNeuralNetConverter::convertToNNLayer)
        .collect(Collectors.toList());
``````

and copying the parameters for a dense layer is:

``````return layerParamsTable.get(WEIGHTS_MATRIX_PARAMS_KEY).transpose().toDoubleMatrix();
``````

where WEIGHTS_MATRIX_PARAMS_KEY = "W". Similarly for the bias vector.
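The transpose above is just a layout change: DL4J stores a dense layer's W with shape [nIn, nOut], while my structure is assumed to be [nOut, nIn]. A minimal plain-Java sketch of that shape change (no DL4J needed):

```java
public class TransposeSketch {
    // Swap rows and columns: [nIn][nOut] -> [nOut][nIn].
    static double[][] transpose(double[][] m) {
        double[][] t = new double[m[0].length][m.length];
        for (int r = 0; r < m.length; r++) {
            for (int c = 0; c < m[0].length; c++) {
                t[c][r] = m[r][c];
            }
        }
        return t;
    }

    public static void main(String[] args) {
        double[][] w = {{1, 2, 3}, {4, 5, 6}};  // pretend this is W, [nIn=2][nOut=3]
        double[][] t = transpose(w);            // [nOut=3][nIn=2]
        System.out.println(t.length + "x" + t[0].length); // 3x2
    }
}
```

This mirrors what `.transpose().toDoubleMatrix()` produces, assuming the target structure really wants the [nOut, nIn] layout.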

For the reverse conversion I go through my model's layers and copy the params into the DL4J net directly:

``````private static MultiLayerNetwork copyParams(NNModel from, MultiLayerNetwork to) {
    for (int i = 0; i < to.getnLayers(); i++) {
        NNLayer fromLayer = from.getNNLayer(i);
        if (fromLayer instanceof NNFeedForwardLayer) {
            NNFeedForwardLayer ffLayer = (NNFeedForwardLayer) fromLayer;
            org.deeplearning4j.nn.api.Layer toLayer = to.getLayer(i);
            toLayer.setParam(WEIGHTS_MATRIX_PARAMS_KEY, Nd4j.create(ffLayer.getWeightsMatrix()).transpose());
            toLayer.setParam(BIAS_PARAMS_KEY, Nd4j.create(new double[][]{ffLayer.getBiasVector()}).transpose());
        }
    }
    return to; // the method signature declares a MultiLayerNetwork return type
}
``````

The problem: if f(x) is the conversion from DL4J to my model and f^-1(x) is the reverse, then x == f^-1(f(x)) comes out false; the params() comparison inside equals() fails. I checked it and found something strange (screenshot on Imgur): some values are copied wrongly as 1 or 0, while others are copied fine.
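Instead of relying on equals() over the whole params() vector, an element-wise round-trip comparison with a tolerance can localize which values break. A plain-Java sketch (epsilon, matrix layout, and the sample values are assumptions for illustration):

```java
public class RoundTripCheck {
    // Report the first element where the original and round-tripped
    // matrices differ by more than eps; return null if they match.
    static String firstMismatch(double[][] a, double[][] b, double eps) {
        for (int r = 0; r < a.length; r++) {
            for (int c = 0; c < a[r].length; c++) {
                if (Math.abs(a[r][c] - b[r][c]) > eps) {
                    return "[" + r + "][" + c + "] " + a[r][c] + " vs " + b[r][c];
                }
            }
        }
        return null;
    }

    public static void main(String[] args) {
        double[][] original     = {{0.5, -1.2}, {3.0, 0.0}};
        double[][] roundTripped = {{0.5, -1.2}, {1.0, 0.0}}; // 3.0 came back as 1.0
        System.out.println(firstMismatch(original, roundTripped, 1e-9));
    }
}
```

Running a check like this per parameter key would have pointed straight at the entries that come back as 1 or 0.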

Is there a more straightforward way to copy parameters? Or have I done it in a bad manner?

I'm still debugging it; when I find the reason, I'll write it here.

Thanks!

Oh, I found the problem right after posting, sorry! >_<" I completely forgot about the batch normalization layer parameters!

Hopefully this note will be useful for someone later~
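The general lesson: copy every entry of each layer's param table instead of hard-coding "W" and "b", so layer types like batch normalization (which carries gamma, beta, and running mean/variance) aren't silently skipped. A plain-Java sketch with a stand-in Map in place of DL4J's paramTable() (the key names here are illustrative):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class CopyAllParams {
    // Copy every parameter entry, whatever its key, rather than
    // only the weight and bias keys of feed-forward layers.
    static Map<String, double[]> copyParams(Map<String, double[]> from) {
        Map<String, double[]> to = new LinkedHashMap<>();
        for (Map.Entry<String, double[]> e : from.entrySet()) {
            to.put(e.getKey(), e.getValue().clone());
        }
        return to;
    }

    public static void main(String[] args) {
        // Stand-in for a batch-norm layer's param table.
        Map<String, double[]> bnLayer = new LinkedHashMap<>();
        bnLayer.put("gamma", new double[]{1.0});
        bnLayer.put("beta",  new double[]{0.0});
        bnLayer.put("mean",  new double[]{0.2});
        bnLayer.put("var",   new double[]{0.9});
        System.out.println(copyParams(bnLayer).keySet()); // all four keys survive
    }
}
```

A key-driven loop like this would have carried the batch-norm parameters across automatically instead of leaving them at their defaults.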