I tried to load a Keras model on a Spark cluster as follows:
- Use KerasModelImport.importKerasModelAndWeights to load the Keras model locally, then save the network with ModelSerializer:
import org.deeplearning4j.nn.modelimport.keras.KerasModelImport
import org.deeplearning4j.util.ModelSerializer
val path = "xxx/full_model.h5"
val net = KerasModelImport.importKerasModelAndWeights(path, false) // enforceTrainingConfig = false
ModelSerializer.writeModel(net, "xxx/serializerModel.zip", false)  // saveUpdater = false
- Unzip serializerModel.zip, then upload coefficients.bin to HDFS.
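For reference, this is roughly how I do the upload; the Scala/Hadoop call below is only a sketch with placeholder paths, and an equivalent hdfs dfs -put from the shell works as well:
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
// Copy the extracted coefficients.bin from the local filesystem to HDFS (paths are placeholders)
val hdfs = FileSystem.get(new Configuration())
hdfs.copyFromLocalFile(new Path("xxx/coefficients.bin"), new Path("hdfs://namenode/xxx/coefficients.bin"))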
- Use ModelSerializer to load the model on the Spark cluster:
import java.io.BufferedInputStream
import org.apache.hadoop.fs.{FileSystem, Path}
import org.deeplearning4j.spark.impl.graph.SparkComputationGraph
import org.deeplearning4j.util.ModelSerializer
val fileSystem = FileSystem.get(sc.hadoopConfiguration)
val is = new BufferedInputStream(fileSystem.open(new Path(args(0))))
val net = ModelSerializer.restoreComputationGraph(is)
val sparkNet = new SparkComputationGraph(sc, net, tm)
Here, args(0) is the HDFS path of coefficients.bin.
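For completeness, tm in the snippet above is the TrainingMaster; its construction is not shown, but it is set up roughly like this (the ParameterAveragingTrainingMaster and the parameter values here are only illustrative):
import org.deeplearning4j.spark.impl.paramavg.ParameterAveragingTrainingMaster
// Illustrative TrainingMaster configuration (values are placeholders)
val tm = new ParameterAveragingTrainingMaster.Builder(32)
  .batchSizePerWorker(32)
  .averagingFrequency(5)
  .build()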
However, when I run the program, I get the error shown in the screenshot below.