Error deserializing MultiLayerConfiguration

When trying to deserialize a model on a Tomcat server (running Debian) I get the error message
Error deserializing MultiLayerConfiguration - configuration may have a custom layer, vertex or preprocessor, in pre version 1.0.0-beta JSON format.
even though I do not use any custom layers. What confuses me most is that the same code runs just fine on Windows. Apparently the class LSTM cannot be found during deserialization, although it causes no problems when creating the layer or training. A minimal working example (MWE) is the following:

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;

import org.deeplearning4j.nn.conf.BackpropType;
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration.ListBuilder;
import org.deeplearning4j.nn.conf.dropout.SpatialDropout;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.deeplearning4j.nn.conf.layers.RnnOutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.nn.weights.WeightInit;
import org.deeplearning4j.util.ModelSerializer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.learning.config.Adam;
import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction;

public static void main(String[] args) {
	try {
		System.out.println("Create net");
		MultiLayerNetwork net = createNet();

		System.out.println("Serialize");
		byte[] serializedNet = storeNet(net);

		System.out.println("Deserialize");
		net = deserializeNet(serializedNet);
	} catch (Exception e) {
		e.printStackTrace();
	}
}

private static MultiLayerNetwork deserializeNet(byte[] serializedNet) throws IOException {
	try (ByteArrayInputStream inputStream = new ByteArrayInputStream(serializedNet)) {
		MultiLayerNetwork model = ModelSerializer.restoreMultiLayerNetwork(inputStream, true);
		return model;
	}
}

private static byte[] storeNet(MultiLayerNetwork net) throws IOException {
	byte[] serializedModel = null;
	try (ByteArrayOutputStream outStream = new ByteArrayOutputStream()) {
		// boolean parameter saveUpdater: whether to save updater of the model
		// updaters are special algorithms for gradient descent
		// we want to be able to update the model so we want to save its updater as well
		boolean saveUpdater = true;
		ModelSerializer.writeModel(net, outStream, saveUpdater);
		serializedModel = outStream.toByteArray();
	}
	return serializedModel;
}

private static MultiLayerNetwork createNet() {
	NeuralNetConfiguration.Builder builder = new NeuralNetConfiguration.Builder();

	builder.seed(1000);
	builder.biasInit(0.0);
	builder.miniBatch(true);
	builder.updater(new Adam(0.001, 0.9, 0.999, 10E-8));
	builder.weightInit(WeightInit.XAVIER);

	ListBuilder listBuilder = builder.list();
	listBuilder.backpropType(BackpropType.TruncatedBPTT).tBPTTLength(8);

	int layerIndex = 0;

	listBuilder.layer(layerIndex++, new LSTM.Builder()
			.nIn(2)
			.nOut(4)
			.dropOut(new SpatialDropout(0.6))
			.activation(Activation.TANH)
			.build());
		
	listBuilder.layer(layerIndex++, new DenseLayer.Builder()
			.nIn(4)
			.nOut(4)
			.dropOut(0.6) // dropout here does not cause any problems
			.activation(Activation.TANH)
			.build());

	listBuilder.layer(layerIndex++, new RnnOutputLayer.Builder(LossFunction.MSE)
			.activation(Activation.SIGMOID)
			.dropOut(new SpatialDropout(0.6))
			.nIn(4)
			.nOut(2)
			.build());

	// create network
	MultiLayerConfiguration conf = listBuilder.build();
	MultiLayerNetwork net = new MultiLayerNetwork(conf);
	net.init();
	return net;
}

The complete error message is:

java.lang.RuntimeException: Error deserializing MultiLayerConfiguration - configuration may have a custom layer, vertex or preprocessor, in pre version 1.0.0-beta JSON format.
Models in legacy format with custom layers should be loaded in 1.0.0-beta to 1.0.0-beta4 and saved again, before loading in the current version of DL4J
at org.deeplearning4j.nn.conf.MultiLayerConfiguration.fromJson(MultiLayerConfiguration.java:171)
at org.deeplearning4j.util.ModelSerializer.restoreMultiLayerNetworkHelper(ModelSerializer.java:324)
at org.deeplearning4j.util.ModelSerializer.restoreMultiLayerNetwork(ModelSerializer.java:238)
at …SerializationMWE.deserializeNet(Unknown Source)
Caused by: org.nd4j.shade.jackson.databind.exc.InvalidTypeIdException: Could not resolve type id 'org.deeplearning4j.nn.conf.layers.LSTM' as a subtype of org.deeplearning4j.nn.conf.layers.Layer: no such class found
at [Source: (String)"{
  "backpropType" : "TruncatedBPTT",
  "cacheMode" : "NONE",
  "confs" : [ {
    "cacheMode" : "NONE",
    "dataType" : "FLOAT",
    "epochCount" : 0,
    "iterationCount" : 0,
    "layer" : {
      "@class" : "org.deeplearning4j.nn.conf.layers.LSTM",
      "activationFn" : {
        "@class" : "org.nd4j.linalg.activations.impl.ActivationTanH"
      },
      "biasInit" : 0.0,
      "biasUpdater" : null,
      "constraints" : null,
      "forgetGateBiasInit" : 1.0,
      "gainInit" : 1.0,
"[truncated 6536 chars]; line: 10, column: 18] (through reference chain: org.deeplearning4j.nn.conf.MultiLayerConfiguration["confs"]->java.util.ArrayList[0]->org.deeplearning4j.nn.conf.NeuralNetConfiguration["layer"])
at org.nd4j.shade.jackson.databind.exc.InvalidTypeIdException.from(InvalidTypeIdException.java:43)
at org.nd4j.shade.jackson.databind.DeserializationContext.invalidTypeIdException(DeserializationContext.java:1761)
at org.nd4j.shade.jackson.databind.DeserializationContext.handleUnknownTypeId(DeserializationContext.java:1268)
at org.nd4j.shade.jackson.databind.jsontype.impl.ClassNameIdResolver._typeFromId(ClassNameIdResolver.java:76)
at org.nd4j.shade.jackson.databind.jsontype.impl.ClassNameIdResolver.typeFromId(ClassNameIdResolver.java:66)
at org.nd4j.shade.jackson.databind.jsontype.impl.TypeDeserializerBase._findDeserializer(TypeDeserializerBase.java:156)
at org.nd4j.shade.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer._deserializeTypedForId(AsPropertyTypeDeserializer.java:113)
at org.nd4j.shade.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer.deserializeTypedFromObject(AsPropertyTypeDeserializer.java:97)
at org.nd4j.shade.jackson.databind.deser.AbstractDeserializer.deserializeWithType(AbstractDeserializer.java:254)
at org.nd4j.shade.jackson.databind.deser.impl.MethodProperty.deserializeAndSet(MethodProperty.java:138)
at org.nd4j.shade.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:288)
at org.nd4j.shade.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:151)
at org.nd4j.shade.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:286)
at org.nd4j.shade.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:245)
at org.nd4j.shade.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:27)
at org.nd4j.shade.jackson.databind.deser.impl.MethodProperty.deserializeAndSet(MethodProperty.java:129)
at org.nd4j.shade.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:288)
at org.nd4j.shade.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:151)
at org.deeplearning4j.nn.conf.serde.MultiLayerConfigurationDeserializer.deserialize(MultiLayerConfigurationDeserializer.java:52)
at org.deeplearning4j.nn.conf.serde.MultiLayerConfigurationDeserializer.deserialize(MultiLayerConfigurationDeserializer.java:42)
at org.nd4j.shade.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:4218)
at org.nd4j.shade.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3214)
at org.nd4j.shade.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3182)
at org.deeplearning4j.nn.conf.MultiLayerConfiguration.fromJson(MultiLayerConfiguration.java:160)
… 14 more
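
To narrow this down, a quick check (just a sketch, not part of the MWE above) would be to ask the webapp's classloader directly whether it can see the LSTM class at all, and which jar it comes from:

	// Illustrative classpath check, run inside the Tomcat webapp
	try {
		Class<?> lstm = Class.forName("org.deeplearning4j.nn.conf.layers.LSTM");
		// print the location (jar) the class was actually loaded from
		System.out.println("LSTM loaded from: "
				+ lstm.getProtectionDomain().getCodeSource().getLocation());
	} catch (ClassNotFoundException e) {
		System.out.println("LSTM is not visible to this classloader: " + e);
	}

If that lookup fails or points at an unexpected jar, the problem lies in how the webapp is packaged rather than in DL4J itself.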

In case it helps I use the following dependencies:

<dependency>
	<groupId>org.deeplearning4j</groupId>
	<artifactId>deeplearning4j-core</artifactId>
	<version>${nd4j.version}</version>
</dependency>

<dependency>
	<groupId>org.nd4j</groupId>
	<artifactId>nd4j-cuda-10.2-platform</artifactId>
	<version>${nd4j.version}</version>
	<classifier></classifier>
</dependency>
		
<dependency>
	<groupId>org.nd4j</groupId>
	<artifactId>nd4j-native-platform</artifactId>
	<version>${nd4j.version}</version>
</dependency>

<dependency>
	<groupId>org.nd4j</groupId>
	<artifactId>nd4j-api</artifactId>
	<version>${nd4j.version}</version>
</dependency>

<dependency>
	<groupId>org.nd4j</groupId>
	<artifactId>nd4j-common</artifactId>
	<version>${nd4j.version}</version>	
</dependency>

where ${nd4j.version} is set to the current release, 1.0.0-beta7.

Can you show how you save the model? Are you trying to load a model from an older version of dl4j?

Sorry for the late reply.

I had hoped the code I included was sufficient:

private static MultiLayerNetwork deserializeNet(byte[] serializedNet) throws IOException {
	try (ByteArrayInputStream inputStream = new ByteArrayInputStream(serializedNet)) {
		MultiLayerNetwork model = ModelSerializer.restoreMultiLayerNetwork(inputStream, true);
		return model;
	}
}

Here I simply pass in the byte array that storeNet returns, and already this call throws the error.

Are you trying to load a model from an older version of dl4j?

No, the model is created with the current version, 1.0.0-beta7. It also does not use any custom layers, contrary to what the error message suggests.

We were able to solve the problem. It turned out that it had nothing to do with DL4J: we had included the DL4J and ND4J dependencies in different projects, and that did not work out properly.
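
For anyone who hits the same symptom: a rough way to spot such a packaging mix-up is to print where the DL4J and ND4J classes are actually loaded from in the deployed application (a sketch; the chosen classes are just examples):

	// Where do the DL4J / ND4J classes actually come from in the deployed webapp?
	// Missing or unexpected locations point at a packaging problem, not at DL4J.
	System.out.println(org.deeplearning4j.nn.conf.MultiLayerConfiguration.class
			.getProtectionDomain().getCodeSource().getLocation());
	System.out.println(org.nd4j.linalg.factory.Nd4j.class
			.getProtectionDomain().getCodeSource().getLocation());

In our case the fix was on the build/packaging side, not in the serialization code.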