How to load a Keras model on a Spark cluster

I tried to load a Keras model on a Spark cluster as follows:

  1. Use importKerasModelAndWeights to load the Keras model locally, then save the network with ModelSerializer:
import org.deeplearning4j.nn.modelimport.keras.KerasModelImport
import org.deeplearning4j.util.ModelSerializer

val path = "xxx/full_model.h5"
val net = KerasModelImport.importKerasModelAndWeights(path, false)
ModelSerializer.writeModel(net, "xxx/serializerModel.zip", false)
  2. Unzip serializerModel.zip, then upload coefficients.bin to HDFS.
  3. Use ModelSerializer to load the model on the Spark cluster:
import java.io.BufferedInputStream
import org.apache.hadoop.fs.{FileSystem, Path}
import org.deeplearning4j.spark.impl.graph.SparkComputationGraph
import org.deeplearning4j.util.ModelSerializer

val fileSystem = FileSystem.get(sc.hadoopConfiguration)
val is = new BufferedInputStream(fileSystem.open(new Path(args(0))))
val net = ModelSerializer.restoreComputationGraph(is)
val sparkNet = new SparkComputationGraph(sc, net, tm)

Here, args(0) is the HDFS path of coefficients.bin.

However, when I run the program, it fails with the error shown in the screenshot below.

That is the problem: the way you are trying to restore it expects a zip file.

The coefficients.bin file is also not enough to restore the model; there is a reason there are multiple things in that zip file. The coefficients file alone contains only the weights, but no information about the actual structure of the network.
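In other words, upload the whole serializerModel.zip to HDFS unmodified and restore from that. A minimal sketch of what that looks like, where the hdfs:///models/... path is made up and should be wherever you actually put the zip:

import java.io.BufferedInputStream
import org.apache.hadoop.fs.{FileSystem, Path}
import org.deeplearning4j.util.ModelSerializer

// Restore from the complete zip written by ModelSerializer.writeModel,
// not from the extracted coefficients.bin
val fs = FileSystem.get(sc.hadoopConfiguration)
val is = new BufferedInputStream(fs.open(new Path("hdfs:///models/serializerModel.zip")))
val restored = ModelSerializer.restoreComputationGraph(is)
is.close()

The zip keeps the network configuration and the weights together, which is why the individual files inside it are not usable on their own.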

I then tried to save the model like this:

ModelSerializer.writeModel(net, "xxx/serializerModel.bin", false)

Then I uploaded serializerModel.bin to HDFS. But again, the program fails with the error shown in the screenshot below.

This would only happen if you use a newer version of DL4J to convert the model and then try to load it with an older version. The version used to save the model and the version used to restore it need to match.
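One way to rule that out is to pin every DL4J artifact to a single version in the build. A sketch in sbt, where the version strings and artifact list are illustrative and should match whatever actually runs on the cluster:

// build.sbt (sketch): keep all DL4J artifacts on one version so a model
// serialized locally restores under the same code version on the cluster
val dl4jVersion = "1.0.0-beta7"  // illustrative; use your cluster's version
libraryDependencies ++= Seq(
  "org.deeplearning4j" % "deeplearning4j-core" % dl4jVersion,
  "org.deeplearning4j" % "deeplearning4j-modelimport" % dl4jVersion,
  // dl4j-spark releases carry a Spark suffix, e.g. "1.0.0-beta7_spark_2"
  "org.deeplearning4j" %% "dl4j-spark" % s"${dl4jVersion}_spark_2"
)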

OK, thank you very much.