Hello, everyone
I’m new to DL4J, and I’m trying to set up the CUDA backend for ND4J in my pom.xml file on Windows 10.
However, it does NOT work because jnind4jcuda.dll is missing.
I searched the Maven repository for nd4j-cuda-11.4 and found that nd4j-cuda-11.4-1.0.0-M2-windows-x86_64.jar does not exist!
It has NOT been released!
So, I would like to know the release date for the complete package.
Thank you!!
PS:
DL4J is a nice open-source tool for studying deep learning; best wishes for the new version.
Delphi Tang.
@delphi_tang we’ll get that fixed for the next release. Go ahead and stick with M1.1 for now. The issues are mainly related to the 6-hour build times on GitHub Actions; it’s hard to keep the builds reliable there due to the number of components. That was missed during the last release but should be fixed soon. Thanks!
Thank you for your reply. I’m working with 1.0.0-beta6 on CUDA 9.2 now.
From your reply, my understanding is that 1.0.0-M1.1 supports CUDA via pom.xml configuration; am I right?
Best wishes!
Delphi Tang
Yes, for now just use that. It uses CUDA 11.x and should be enough until M2.1 is released.
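For reference, a minimal CUDA backend dependency for 1.0.0-M1.1 might look like the snippet below. The exact CUDA minor version (11.2 here) is an assumption; check the M1.1 release notes or Maven Central for the versions actually published.

```xml
<!-- ND4J CUDA backend for 1.0.0-M1.1 (CUDA 11.2 assumed; verify against the release) -->
<dependency>
    <groupId>org.nd4j</groupId>
    <artifactId>nd4j-cuda-11.2-platform</artifactId>
    <version>1.0.0-M1.1</version>
</dependency>
```

The `-platform` variant bundles the native binaries for all supported operating systems, so no OS-specific classifier is needed.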
Ok!
Thank you very much.
I think 1.0.0-beta6 is enough for me to study DL4J.
Also, do you have a planned release date for 1.0.0-M2.1?
@delphi_tang likely next week if all goes well.
Moreover, I have another question about CUDA support.
What is the difference between nd4j-cuda-10.2, nd4j-cuda-10.2-preset, and nd4j-cuda-10.2-platform?
And is there any reference covering these basic concepts for newcomers?
Thank you!
@delphi_tang just stick to the artifacts we have in the docs. Please also do not use CUDA 10.2 there; that’s only for the Jetson Nano. On Intel-based platforms (basically everything else) we only support CUDA 11.x. Stick to M1.1 and nothing else. I will not answer questions related to older versions (either due to bugs, or users willingly picking older versions with potential known bugs).
The platform artifact is just for people who don’t want to worry about which platform they are running on (Windows/Linux).
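To illustrate the difference, here is a sketch of the two approaches, using nd4j-cuda-11.2 and the `windows-x86_64` classifier as assumed examples; the exact artifact IDs and classifiers published for a given release should be verified on Maven Central.

```xml
<!-- Option 1: the -platform artifact pulls natives for all supported
     operating systems (larger download, no classifier needed). -->
<dependency>
    <groupId>org.nd4j</groupId>
    <artifactId>nd4j-cuda-11.2-platform</artifactId>
    <version>1.0.0-M1.1</version>
</dependency>

<!-- Option 2: the base artifact plus one explicit native classifier
     (smaller download, but tied to a single platform). -->
<dependency>
    <groupId>org.nd4j</groupId>
    <artifactId>nd4j-cuda-11.2</artifactId>
    <version>1.0.0-M1.1</version>
</dependency>
<dependency>
    <groupId>org.nd4j</groupId>
    <artifactId>nd4j-cuda-11.2</artifactId>
    <version>1.0.0-M1.1</version>
    <classifier>windows-x86_64</classifier>
</dependency>
```

Option 1 is the simpler choice when build size is not a concern, which matches the advice above.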
Got it! Thank you for your reply.