Read model as InputStream

Hello, I tried the deprecated method but it spits out errors. I am using Spring Boot, and I cannot use resource.getFile() when running from inside a jar package. Any suggestions on what I can use, other than creating a file from the InputStream and reading from that?
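(For reference: the usual workaround when resource.getFile() fails inside a jar is to read the resource as a stream via getResourceAsStream(), which works whether the classpath entry is a directory or a jar. Below is a minimal stdlib-only sketch; the model.bin name and the URLClassLoader stand-in for a packaged jar are just for illustration, not part of any DL4J or Spring API.)

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;

public class ResourceStreamDemo {
    // Reads a classpath resource fully into memory as a stream. Unlike
    // Resource.getFile(), this works when the resource lives inside a jar.
    static byte[] readResource(ClassLoader cl, String name) throws IOException {
        try (InputStream in = cl.getResourceAsStream(name)) {
            if (in == null) {
                throw new IOException("resource not found on classpath: " + name);
            }
            return in.readAllBytes(); // Java 9+
        }
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for a packaged resource: a temp directory on a URLClassLoader.
        // In a Spring Boot app you would use the application class loader instead.
        Path dir = Files.createTempDirectory("res-demo");
        Files.write(dir.resolve("model.bin"), new byte[] {1, 2, 3});
        try (URLClassLoader cl = new URLClassLoader(new URL[] { dir.toUri().toURL() })) {
            byte[] data = readResource(cl, "model.bin");
            System.out.println("read " + data.length + " bytes"); // prints: read 3 bytes
        }
    }
}
```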

Update:
It seems I can’t use your package if I want to package my Spring Boot application as a jar; it is spitting out errors when running my jar (errors from your package):

java.nio.file.NoSuchFileException: /BOOT-INF/lib/deeplearning4j-core-1.0.0-beta6.jar!/ai/skymind
at jdk.zipfs/jdk.nio.zipfs.ZipPath.readAttributes(ZipPath.java:769)
at jdk.zipfs/jdk.nio.zipfs.ZipPath.readAttributes(ZipPath.java:777)
at jdk.zipfs/jdk.nio.zipfs.ZipFileSystemProvider.readAttributes(ZipFileSystemProvider.java:276)
at java.base/java.nio.file.Files.readAttributes(Files.java:1843)
at java.base/java.nio.file.FileTreeWalker.getAttributes(FileTreeWalker.java:219)
at java.base/java.nio.file.FileTreeWalker.visit(FileTreeWalker.java:276)
at java.base/java.nio.file.FileTreeWalker.walk(FileTreeWalker.java:322)
at java.base/java.nio.file.Files.walkFileTree(Files.java:2796)
at java.base/java.nio.file.Files.walkFileTree(Files.java:2876)
at org.nd4j.versioncheck.VersionCheck.listGitPropertiesFiles(VersionCheck.java:230)
at org.nd4j.versioncheck.VersionCheck.getVersionInfos(VersionCheck.java:264)
at org.nd4j.versioncheck.VersionCheck.checkVersions(VersionCheck.java:109)
at org.nd4j.linalg.factory.Nd4j.initWithBackend(Nd4j.java:5141)
at org.nd4j.linalg.factory.Nd4j.initContext(Nd4j.java:5129)
at org.nd4j.linalg.factory.Nd4j.&lt;clinit&gt;(Nd4j.java:226)
at org.deeplearning4j.models.embeddings.loader.WordVectorSerializer.readBinaryModel(WordVectorSerializer.java:235)

It seems I need to disable the version check. How do I do that in a Spring application?

I’ve been using DL4J with Spring Boot and uberjars for a while without any issues.

How are you packaging your application?

The only issue is with the version check. I do mvn clean install.
Is there a solution for loading a resource (a word2vec binary model) without creating a temp file to write to when inside a jar? I tried readWord2Vec with an InputStream (with both the false and true options), but it errors out with syn0 = null.

You are talking about two different things. The error you shared in the stack trace happens during Nd4j initialization, on the first access of Nd4j. You should be able to trigger it with Nd4j.create(1);

And it does happen because you are missing the version information. That is why I am asking how you are packaging your application.

With mvn clean install, created by IntelliJ IDEA.
Is there any way to read Google’s binary model as an input stream?

I have the version specified in my pom.xml.

Again, your original post has two problems:

  1. ND4J not even initializing
  2. Trying to read Google’s binary model from an input stream

So far I’ve been trying to help you with the first problem. And at the moment, I can’t tell if we are making any progress here, as you keep trying to jump to the second one.

For the second problem: No, you cannot load Google’s binary model from an input stream. WordVectorSerializer.readBinaryModel is for loading W2V in DL4J’s own format. In order to load Google’s model, you should be using loadStaticModel. It will take an InputStream, but it will write it to a temporary file first. So technically, it is still reading it from a file.
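The spill-to-temp-file trick described above can be done with the stdlib alone. A minimal sketch (the stream contents are a stand-in for the real model, and the commented-out ClassPathResource and loadStaticModel calls are only illustrative, not verified against a specific DL4J version):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class TempFileModelLoad {
    // Spill an InputStream to a temp file so that file-based loaders can be
    // used even when the original data lives inside a jar.
    static Path spillToTempFile(InputStream in) throws IOException {
        Path tmp = Files.createTempFile("w2v-model", ".bin");
        tmp.toFile().deleteOnExit(); // clean up when the JVM exits
        Files.copy(in, tmp, StandardCopyOption.REPLACE_EXISTING);
        return tmp;
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for the model stream; in the real app this would be e.g.
        // new ClassPathResource("GoogleNews-vectors.bin").getInputStream()
        InputStream in = new ByteArrayInputStream(new byte[] {42});
        Path model = spillToTempFile(in);
        System.out.println("spilled " + Files.size(model) + " bytes");
        // Then load from the file, e.g.:
        // WordVectorSerializer.loadStaticModel(model.toFile());
    }
}
```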

For the first problem: I am doing it with pom.xml.
Without packaging, everything is OK; I have the versions in my pom.

So what exactly is your problem now?

It is not really a problem, but I would like to disable this version check error. It is nothing crucial; as I understand it, it just checks that the DL4J and NLP versions are the same. It seems to not see something, or not be able to read something.

You cannot disable it; you have to fix what is causing it in the first place.

The file it is looking for is missing from your uberjar. You have to make sure that it is being included. Start by checking the jar file you are creating. Is it being included?

Yes (screenshot)

But that isn’t the file you are running, is it? This is the dependency itself.

As you say here, you are using Spring Boot. So you are either producing a war file or, more likely, an uberjar that contains all your dependencies. Does this file include it?

You mean include this file that is missing? Yes, I showed it to you in the screenshot.

You have not.

What you showed me was the contents of deeplearning4j-core-1.0.0-beta6.jar.

When you run mvn package to create the jar or war file you use for deployment, you should end up with a jar file in your target directory. Does the jar file in your target directory contain it?
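One way to check this without unpacking anything is to list the jar's entries programmatically. A stdlib-only sketch; the "-git.properties" suffix is an assumption based on the files the ND4J version check walks (per the ai/skymind path in the stack trace), and the main method builds a tiny stand-in jar rather than pointing at target/your-app.jar:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Enumeration;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;
import java.util.jar.JarOutputStream;

public class JarEntryCheck {
    // Returns true if the jar contains an entry whose name ends with the
    // given suffix, e.g. "-git.properties".
    static boolean containsEntry(Path jar, String suffix) throws IOException {
        try (JarFile jf = new JarFile(jar.toFile())) {
            Enumeration<JarEntry> entries = jf.entries();
            while (entries.hasMoreElements()) {
                if (entries.nextElement().getName().endsWith(suffix)) {
                    return true;
                }
            }
        }
        return false;
    }

    public static void main(String[] args) throws Exception {
        // Build a tiny stand-in jar; in practice, point containsEntry at the
        // jar in your target directory instead.
        Path jar = Files.createTempFile("demo", ".jar");
        try (JarOutputStream out = new JarOutputStream(Files.newOutputStream(jar))) {
            out.putNextEntry(new JarEntry("ai/skymind/nd4j-git.properties"));
            out.closeEntry();
        }
        System.out.println(containsEntry(jar, "-git.properties")); // prints true
    }
}
```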

Yes, I showed it to you in my screenshot; this is the jar inside my uberjar.

Everything works; the only thing is that it outputs this exception, but vectorisation works and all that.

Oh, I apologize. I see that Spring Boot isn’t using regular uber jars any more, and has started to package actual jar files into their main jar file.

I’ll check if there is anything we can do to solve this.

I’ve created an issue to track progress on this: https://github.com/eclipse/deeplearning4j/issues/8944