I updated my POM to 1.0.0-M1 today and it seems to have broken my code. I’m now getting some backend-related errors on INDArray creation via Nd4j.create(someArray):
14:09:47.831 [main] INFO org.nd4j.linalg.factory.Nd4jBackend - Loaded [CpuBackend] backend
14:09:47.831 [main] ERROR org.nd4j.common.config.ND4JClassLoading - Cannot find class [org.nd4j.linalg.jblas.JblasBackend] of provided class-loader.
14:09:47.831 [main] ERROR org.nd4j.common.config.ND4JClassLoading - Cannot find class [org.canova.api.io.data.DoubleWritable] of provided class-loader.
14:09:47.831 [main] ERROR org.nd4j.common.config.ND4JClassLoading - Cannot find class [org.nd4j.linalg.jblas.JblasBackend] of provided class-loader.
14:09:47.831 [main] ERROR org.nd4j.common.config.ND4JClassLoading - Cannot find class [org.canova.api.io.data.DoubleWritable] of provided class-loader.
Warning: Could not load Loader: java.lang.UnsatisfiedLinkError: java.io.FileNotFoundException: C:\Users\wneill\.javacpp\cache\.lock (Access is denied)
That “Access is denied” bit is weird.
Somewhere down the stack, I’m being informed that I might be missing dependencies. Are there new dependencies that I should include for this new version of nd4j?
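For context, here’s roughly what my POM has now (a sketch; `nd4j-native-platform` is the standard CPU backend artifact, and I just bumped the version):

```xml
<dependency>
    <groupId>org.nd4j</groupId>
    <artifactId>nd4j-native-platform</artifactId>
    <version>1.0.0-M1</version>
</dependency>
```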
I’m not sure what JavaCV is, to be honest. If it’s like OpenCV for Java, then I’m probably not using it. Most of my applications right now are clustering-based, so I’m just using nd4j for my work.
I look forward to working with DL4J, especially since I heard I might be able to port my TF and PT models over one day soon.
I’m doing some kaggling on the side to keep my Python fresh. I’ll see if I can’t get some of my PyTorch models ported over.
I think I saw elsewhere that it needs to be converted to ONNX first, which I’ve not messed with before but I’ll give it a shot.
@saudet Also, for whatever reason, that .lock file was hidden and I just had to uncheck “Hidden” in the file properties on Windows. Now the new version is running fine on my machine.
@wcneill Just a note: 1.0.0-M1.1 (notice the .1) is out with some improvements to the helper functionality (it was breaking things for some people). Regarding ONNX: again, there are two ways of running it: onnxruntime (which is just for inference) or the new model import framework.
Take a look at the supported ops, or send me your model and I would be happy to help you convert it. It’s very easy to extend the framework with annotations as a workaround if something doesn’t work, and I would love a real test case.