Errors when Trying to import Keras Model into Java

I’m relatively new to DL4J but have some experience with Java. I’m currently trying to import a model in .h5 format using importKerasSequentialModelAndWeights(). The issue is that it fails and I get the following error:

Exception in thread "main" java.lang.UnsatisfiedLinkError: no jnihdf5 in java.library.path: /Users/maxdeweese/Library/Java/Extensions:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java:.
    at java.base/java.lang.ClassLoader.loadLibrary(ClassLoader.java:2458)
    at java.base/java.lang.Runtime.loadLibrary0(Runtime.java:916)
    at java.base/java.lang.System.loadLibrary(System.java:2059)
    at org.bytedeco.javacpp.Loader.loadLibrary(Loader.java:1800)
    at org.bytedeco.javacpp.Loader.load(Loader.java:1402)
    at org.bytedeco.javacpp.Loader.load(Loader.java:1214)
    at org.bytedeco.javacpp.Loader.load(Loader.java:1190)
    at org.bytedeco.hdf5.global.hdf5.(hdf5.java:14)
    at java.base/java.lang.Class.forName0(Native Method)
    at java.base/java.lang.Class.forName(Class.java:534)
    at java.base/java.lang.Class.forName(Class.java:513)
    at org.bytedeco.javacpp.Loader.load(Loader.java:1269)
    at org.bytedeco.javacpp.Loader.load(Loader.java:1214)
    at org.bytedeco.javacpp.Loader.load(Loader.java:1206)
    at org.deeplearning4j.nn.modelimport.keras.Hdf5Archive.(Hdf5Archive.java:56)
    at org.deeplearning4j.nn.modelimport.keras.utils.KerasModelBuilder.modelHdf5Filename(KerasModelBuilder.java:229)
    at org.deeplearning4j.nn.modelimport.keras.KerasModelImport.importKerasSequentialModelAndWeights(KerasModelImport.java:202)
    at org.example.Main.load_model(Main.java:48)
    at org.example.Main.main(Main.java:28)
Caused by: java.lang.UnsatisfiedLinkError: Could not find jnihdf5 in class, module, and library paths.
    at org.bytedeco.javacpp.Loader.loadLibrary(Loader.java:1767)
    ... 15 more
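For reference, my loading code is essentially just the standard import call, roughly like this (the model path is a placeholder):

import org.deeplearning4j.nn.modelimport.keras.KerasModelImport;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;

public class Main {
    public static void main(String[] args) throws Exception {
        // Path to the Keras .h5 file exported from Python (placeholder)
        String modelPath = "model.h5";
        // This is the call that fails with the UnsatisfiedLinkError above
        MultiLayerNetwork model =
                KerasModelImport.importKerasSequentialModelAndWeights(modelPath);
        System.out.println(model.summary());
    }
}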

You have everything you need to solve your problem here:
java.lang.UnsatisfiedLinkError: no jnihdf5 in java.library.path: /Users/maxdeweese/Library/Java/Extensions:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java:

Your exception is
java.lang.UnsatisfiedLinkError

It is caused by:
no jnihdf5 in java.library.path

It means that Java cannot find jnihdf5 in any of your java paths.

/Users/maxdeweese/ etc. are all the various paths where Java searched for the jnihdf5 library and didn’t find it. You have to add the missing library file to at least one of those directories.

I’d advise you to just add all the JAR files you are using to /usr/lib/java.
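If you want to see exactly where Java is looking, you can print that property yourself (a small sketch):

public class PrintLibraryPath {
    public static void main(String[] args) {
        // Directories Java searches when loading native libraries such as jnihdf5
        System.out.println(System.getProperty("java.library.path"));
        // You can also extend the search path when launching, for example:
        //   java -Djava.library.path=/usr/lib/java -cp <your classpath> org.example.Main
    }
}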

Thanks for the reply.
I understand that I need this file; the issue is that I don’t know what type of file it is or where to get it. Could you please help?

@Max I’m guessing you’re missing the hdf5 native binary. You might be running on a Mac with arm64; if you are, I’m not sure upstream provides bindings for that platform.

If that’s the issue, you only need to convert the model once; then you can use it.

Could you show your pom?

Hi @agibsonccc, I appreciate the feedback. I do use a newer MacBook Air, which uses arm64. Is there a library or model format/import path that supports most common computers? Here’s my POM file:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>org.example</groupId>
    <artifactId>AniLive</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <maven.compiler.source>21</maven.compiler.source>
        <maven.compiler.target>21</maven.compiler.target>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>
    <dependencies>

        <!-- Deeplearning 4j-->

        <dependency>
            <groupId>org.deeplearning4j</groupId>
            <artifactId>deeplearning4j-core</artifactId>
            <version>1.0.0-M2.1</version>
        </dependency>
        <dependency>
            <groupId>org.deeplearning4j</groupId>
            <artifactId>deeplearning4j-modelimport</artifactId>
            <version>1.0.0-beta2</version>
        </dependency>

        <!-- nd4j -->

        <dependency>
            <groupId>org.nd4j</groupId>
            <artifactId>nd4j-common</artifactId>
            <version>1.0.0-M2.1</version>
        </dependency>
        <dependency>
            <groupId>org.nd4j</groupId>
            <artifactId>nd4j-native-platform</artifactId>
            <version>1.0.0-M2.1</version>
        </dependency>
        <dependency>
            <groupId>org.nd4j</groupId>
            <artifactId>nd4j-native</artifactId>
            <version>1.0.0-M2.1</version>
            <classifier>macosx-arm64</classifier>
        </dependency>

        <!-- OpenBLAS-->

        <dependency>
            <groupId>org.bytedeco</groupId>
            <artifactId>openblas</artifactId>
            <version>0.3.26-1.5.10</version>
        </dependency>
    </dependencies>
</project>

@Max why did you randomly import a 2-year-old version of deeplearning4j-modelimport? That seems out of place. All versions should be the same.

@agibsonccc I just updated the POM file to use the newest version of deeplearning4j-modelimport; the issue still persists. Could you give me more info on converting the model so I can use it on my machine, and on how to make it less platform-dependent? Thanks for all the advice, as I’m relatively new to Maven and ML in Java.

@Max I gave you what you could try and the likely cause of your problem already. There’s no more info to give. You’re running on an ARM Mac. What runs underneath to import that model is not Java code; it’s natively compiled C++ code, and there aren’t published bindings for that platform.

Just in case you don’t know what I mean by that: we use something called javacpp to generate bindings for different platforms. Javacpp wraps a number of C++ libraries and publishes Java bindings for each platform. We use that for our Keras model import as well. That library is the hdf5 library you’re seeing.
Even the latest version only appears to be compiled for Intel platforms: Central Repository: org/bytedeco/hdf5/1.14.3-1.5.10

That library is a 3rd party dependency we don’t control.
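If you want to double-check which platform javacpp detects on your machine, you can print it (a quick sketch):

import org.bytedeco.javacpp.Loader;

public class PlatformCheck {
    public static void main(String[] args) {
        // On an Apple Silicon Mac this typically prints "macosx-arm64",
        // which is not among the platforms the hdf5 bindings above are published for
        System.out.println(Loader.getPlatform());
    }
}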

I will only repeat this one more time: convert on literally any other computer that has Intel-based hardware. Model conversion only needs to happen once. Convert the model and just save the result.
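Roughly, the one-time conversion looks like this on a machine that does have the hdf5 bindings (a minimal sketch; file names are placeholders). Once the model is saved in DL4J’s own zip format, you can load it on any platform, including your arm64 Mac, without touching hdf5 again:

import java.io.File;

import org.deeplearning4j.nn.modelimport.keras.KerasModelImport;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;

public class ConvertOnce {
    public static void main(String[] args) throws Exception {
        // One-time step on a machine with hdf5 bindings (e.g. an Intel box):
        // import the Keras .h5 and save it in DL4J's native zip format
        MultiLayerNetwork imported =
                KerasModelImport.importKerasSequentialModelAndWeights("model.h5");
        imported.save(new File("model-dl4j.zip"), true);

        // On any other machine (including arm64): load the converted model.
        // This path only needs nd4j, not the hdf5 native library.
        MultiLayerNetwork restored = MultiLayerNetwork.load(new File("model-dl4j.zip"), true);
        System.out.println(restored.summary());
    }
}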