Version `GLIBC_2.23' not found when using dl4j-cuda

I am using dl4j-cuda version 1.0.0-M1.1 and I am running my application on an HPC cluster with CUDA 11.1. I get this error:

```
Caused by: java.lang.UnsatisfiedLinkError: /home/h4/nore667e/.javacpp/cache/deepLearningSimpleOne-1.0-SNAPSHOT-jar-with-dependencies.jar/org/nd4j/nativeblas/linux-x86_64/libjnind4jcuda.so: /lib64/libm.so.6: version `GLIBC_2.23' not found (required by /home/h4/nore667e/.javacpp/cache/deepLearningSimpleOne-1.0-SNAPSHOT-jar-with-dependencies.jar/org/nd4j/nativeblas/linux-x86_64/libnd4jcuda.so)
```

I tried to install GLIBC 2.23, but it didn't work. My dependencies are:

<dependency>
    <groupId>org.bytedeco</groupId>
    <artifactId>cuda-platform-redist</artifactId>
    <version>11.2-8.1-1.5.5</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.nd4j</groupId>
    <artifactId>nd4j-cuda-11.2</artifactId>
    <version>1.0.0-M1.1</version>
    <classifier>linux-x86_64-cudnn</classifier>
</dependency>
<dependency>
    <groupId>org.nd4j</groupId>
    <artifactId>${nd4j.backend}</artifactId>
    <version>${dl4j-master.version}</version>
</dependency>
<dependency>
    <groupId>org.deeplearning4j</groupId>
    <artifactId>deeplearning4j-cuda-11.2</artifactId>
    <version>1.0.0-M1.1</version>
</dependency>

I have been trying to fix this for a while without success. Does anyone know how to fix it?
Thank you!

What OS are you running on for the workers? You need a newer glibc in order to run this. If you are running on a very old system you will not be able to run your workload. I can try to help you figure out a better way to do this but will need more to work with.
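As a quick check, something like this run on a worker node will show the glibc and OS versions (a minimal sketch):

```
# Print the glibc version the dynamic linker reports
ldd --version | head -n1

# Alternative check via getconf
getconf GNU_LIBC_VERSION

# OS release info
cat /etc/os-release
```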

These are the details of my OS:
NAME="CentOS Linux"
VERSION="7 (Core)"
ID="centos"
ID_LIKE="rhel fedora"
VERSION_ID="7"
PRETTY_NAME="CentOS Linux 7 (Core)"
ANSI_COLOR="0;31"
CPE_NAME="cpe:/o:centos:centos:7"
HOME_URL="https://www.centos.org/"
BUG_REPORT_URL="https://bugs.centos.org/"

CENTOS_MANTISBT_PROJECT="CentOS-7"
CENTOS_MANTISBT_PROJECT_VERSION="7"
REDHAT_SUPPORT_PRODUCT="centos"
REDHAT_SUPPORT_PRODUCT_VERSION="7"

My current glibc version is 2.17

@Nour-Rekik Yeah, CentOS 7 is end of life at this point. You will need to upgrade to CentOS 8 in order to run anything.

Thank you so much for your reply. That's unfortunate, because I already installed GLIBC 2.23 but I still get the same error.
I also wanted to know whether GPU/CUDA is supported by beta7, so that I can switch back to beta7, since I didn't have this problem with that version.

@Nour-Rekik It is, but not that CUDA version. That version is also very old and won't be as performant. Could you give me a full stack trace? If you get the same error, it means the new glibc isn't being used. You may not have installed it on the workers where this is running. Please double check how you did that.

This is the full error:

Exception in thread "main" java.lang.ExceptionInInitializerError
        at org.nd4j.jita.concurrency.CudaAffinityManager.getNumberOfDevices(CudaAffinityManager.java:136)
        at org.nd4j.jita.constant.ConstantProtector.purgeProtector(ConstantProtector.java:60)
        at org.nd4j.jita.constant.ConstantProtector.<init>(ConstantProtector.java:53)
        at org.nd4j.jita.constant.ConstantProtector.<clinit>(ConstantProtector.java:41)
        at org.nd4j.jita.constant.ProtectedCudaConstantHandler.<clinit>(ProtectedCudaConstantHandler.java:69)
        at org.nd4j.jita.constant.CudaConstantHandler.<clinit>(CudaConstantHandler.java:38)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:348)
        at org.nd4j.common.config.ND4JClassLoading.loadClassByName(ND4JClassLoading.java:62)
        at org.nd4j.common.config.ND4JClassLoading.loadClassByName(ND4JClassLoading.java:56)
        at org.nd4j.linalg.factory.Nd4j.initWithBackend(Nd4j.java:5152)
        at org.nd4j.linalg.factory.Nd4j.initContext(Nd4j.java:5093)
        at org.nd4j.linalg.factory.Nd4j.<clinit>(Nd4j.java:270)
        at org.datavec.image.loader.NativeImageLoader.transformImage(NativeImageLoader.java:670)
        at org.datavec.image.loader.NativeImageLoader.asMatrix(NativeImageLoader.java:593)
        at org.datavec.image.loader.NativeImageLoader.asMatrix(NativeImageLoader.java:281)
        at org.datavec.image.loader.NativeImageLoader.asMatrix(NativeImageLoader.java:256)
        at org.datavec.image.loader.NativeImageLoader.asMatrix(NativeImageLoader.java:250)
        at org.datavec.image.recordreader.BaseImageRecordReader.next(BaseImageRecordReader.java:247)
        at org.datavec.image.recordreader.BaseImageRecordReader.nextRecord(BaseImageRecordReader.java:511)
        at org.deeplearning4j.datasets.datavec.RecordReaderDataSetIterator.initializeUnderlying(RecordReaderDataSetIterator.java:194)
        at org.deeplearning4j.datasets.datavec.RecordReaderDataSetIterator.next(RecordReaderDataSetIterator.java:341)
        at org.deeplearning4j.datasets.datavec.RecordReaderDataSetIterator.next(RecordReaderDataSetIterator.java:421)
        at org.deeplearning4j.datasets.datavec.RecordReaderDataSetIterator.next(RecordReaderDataSetIterator.java:53)
        at com.examples.DeepLearningOnSpark.imageNet_image.streaming.NetworkRetrainingMain.entryPoint(NetworkRetrainingMain.java:55)
        at com.examples.DeepLearningOnSpark.imageNet_image.streaming.NetworkRetrainingMain.main(NetworkRetrainingMain.java:31)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.RuntimeException: ND4J is probably missing dependencies. For more information, please refer to: https://deeplearning4j.konduit.ai/nd4j/backend
        at org.nd4j.nativeblas.NativeOpsHolder.<init>(NativeOpsHolder.java:116)
        at org.nd4j.nativeblas.NativeOpsHolder.<clinit>(NativeOpsHolder.java:37)
        ... 38 more
Caused by: java.lang.UnsatisfiedLinkError: no jnind4jcuda in java.library.path
        at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1867)
        at java.lang.Runtime.loadLibrary0(Runtime.java:870)
        at java.lang.System.loadLibrary(System.java:1122)
        at org.bytedeco.javacpp.Loader.loadLibrary(Loader.java:1718)
        at org.bytedeco.javacpp.Loader.load(Loader.java:1328)
        at org.bytedeco.javacpp.Loader.load(Loader.java:1132)
        at org.nd4j.nativeblas.Nd4jCuda.<clinit>(Nd4jCuda.java:10)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:348)
        at org.nd4j.common.config.ND4JClassLoading.loadClassByName(ND4JClassLoading.java:62)
        at org.nd4j.common.config.ND4JClassLoading.loadClassByName(ND4JClassLoading.java:56)
        at org.nd4j.nativeblas.NativeOpsHolder.<init>(NativeOpsHolder.java:88)
        ... 39 more
Caused by: java.lang.UnsatisfiedLinkError: /home/h4/nore667e/.javacpp/cache/deepLearningSimpleOne-1.0-SNAPSHOT-jar-with-dependencies.jar/org/nd4j/nativeblas/linux-x86_64/libjnind4jcuda.so: /lib64/libm.so.6: version `GLIBC_2.23' not found (required by /home/h4/nore667e/.javacpp/cache/deepLearningSimpleOne-1.0-SNAPSHOT-jar-with-dependencies.jar/org/nd4j/nativeblas/linux-x86_64/libnd4jcuda.so)
        at java.lang.ClassLoader$NativeLibrary.load(Native Method)
        at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1941)
        at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1824)
        at java.lang.Runtime.load0(Runtime.java:809)
        at java.lang.System.load(System.java:1086)
        at org.bytedeco.javacpp.Loader.loadLibrary(Loader.java:1668)

I installed GLIBC 2.23 and I export it like this:

export PATH=/scratch/ws/1/s4122485-glibc/lib_new/bin:$PATH

export LD_LIBRARY_PATH=/scratch/ws/1/s4122485-glibc/lib_new:$LD_LIBRARY_PATH
 

Since I am using Spark, I did the configuration with source framework-configure.sh spark $SPARK_HOME/conf, and in spark-env.sh I added
export LD_LIBRARY_PATH=/scratch/ws/1/s4122485-glibc/lib_new/:$LD_LIBRARY_PATH
and then continued with start-all.sh.

When I run ldd --version it shows 2.23.
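To double-check whether that glibc is actually the one being resolved, something like this could help (a rough sketch; it assumes the new libm.so.6 sits directly under lib_new):

```
# Which libm.so.6 does the nd4j native library resolve to in the current shell environment?
ldd /home/h4/nore667e/.javacpp/cache/deepLearningSimpleOne-1.0-SNAPSHOT-jar-with-dependencies.jar/org/nd4j/nativeblas/linux-x86_64/libnd4jcuda.so | grep libm

# Does the system libm (the one named in the error) export GLIBC_2.23?
objdump -T /lib64/libm.so.6 | grep GLIBC_2.23 | head -n3

# Does the custom glibc copy export it? (path/layout is an assumption)
objdump -T /scratch/ws/1/s4122485-glibc/lib_new/libm.so.6 | grep GLIBC_2.23 | head -n3
```

If the first command still points at /lib64/libm.so.6 even with LD_LIBRARY_PATH exported, the custom glibc likely isn't the one being picked up at load time.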

Hello, I'm not an official DL4J person, but I too had a lot of trouble running on CentOS 7 because of library versions. I couldn't upgrade some things to the minimum versions DL4J requires because it broke other things.

My ultimate solution was to recompile DL4J on my target architecture. It was a lot of work to figure out, and it takes a long time, but it worked. I also had to make changes to the DL4J build scripts to suit CentOS 7.

As a rough outline, to compile DL4J I:

  1. Installed the devtoolset kit. I think I went with version 10: 'yum install devtoolset-10*'
  2. Downloaded, built, and installed a newer version of CMake from the Kitware/CMake source on GitHub.
  3. Downloaded, compiled, and installed OpenBLAS, using the xianyi/OpenBLAS source on GitHub.
  4. Downloaded and installed Java 8 and Maven.
  5. Ran the Maven build… I don't have the commands handy as I turned this all into an Ansible script, but I based it off of the directions here and it was something like mvn clean package -DskipTests -Pcpu -Pcuda -Djavacpp.platform=x86_64… (see the rough sketch after this list). If you want, I can find the exact commands.
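Very roughly, the environment preparation looked something like this (a sketch from memory; the devtoolset version, install prefixes, and whether you have sudo/yum rights depend on your system):

```
# Newer gcc toolchain via Software Collections (needs yum/admin rights)
sudo yum install -y centos-release-scl
sudo yum install -y devtoolset-10
scl enable devtoolset-10 bash          # start a shell with the newer gcc on PATH

# Build a recent CMake from source into the home directory
git clone https://github.com/Kitware/CMake.git
cd CMake && ./bootstrap --prefix=$HOME/tools/cmake && make -j"$(nproc)" && make install && cd ..
export PATH=$HOME/tools/cmake/bin:$PATH

# OpenBLAS (for the CPU backend)
git clone https://github.com/xianyi/OpenBLAS.git
cd OpenBLAS && make -j"$(nproc)" && make PREFIX=$HOME/tools/openblas install && cd ..
```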

@chris2 Of note, you don't actually need OpenBLAS when building for CUDA.
Regarding your use case here: at least for CPU, note that we do have the -compat classifier, which is built for very old glibc versions. I would suggest you try that, at least for CPU.

Do you have a use case where you are running an older glibc but using a newer CUDA? Do people actually do that? Usually nvcc (the CUDA compiler) is hard coded to specific host gcc versions. I'm not sure how that lines up with older glibcs.

There’s usually an infinite number of combinations of versions people want.
Internally we’re working on a tool that might help alleviate that but it’s still early yet: GitHub - KonduitAI/kompile: Kompile generates optimized machine learning pipelines usable from python

It’s working on making the modular c++ code base + backend architecture a bit easier for people to deal with while hopefully giving people the knobs to control things they want like binary size, glibc version, cuda version, platform, etc.

If you could elaborate a bit it would help with potentially automating some of the pain points here. We already publish a ton of different classifiers for about every conceivable combination people usually want but I guess that itself still has limitations.

I downgraded to beta7 and here is the new error I am getting:

Could you help me solve this, please?

@Nour-Rekik Could you try building from source like we discussed? I'm not supporting a 2 year old version.

Am I understanding correctly that by "building from source" you mean I should run the Maven command mvn clean package, which will create the jar-with-dependencies file, and then submit that jar with Spark, right?

<plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>${maven-shade-plugin.version}</version>
                <configuration>
                    <shadedArtifactAttached>true</shadedArtifactAttached>
                    <shadedClassifierName>${shadedClassifier}</shadedClassifierName>
                    <createDependencyReducedPom>true</createDependencyReducedPom>
                    <filters>
                        <filter>
                            <artifact>*:*</artifact>
                            <excludes>
                                <exclude>org/datanucleus/**</exclude>
                                <exclude>META-INF/*.SF</exclude>
                                <exclude>META-INF/*.DSA</exclude>
                                <exclude>META-INF/*.RSA</exclude>
                            </excludes>
                        </filter>
                    </filters>
                </configuration>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <transformers>
                                <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                                    <resource>reference.conf</resource>
                                </transformer>
                                <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                </transformer>
                            </transformers>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
            <!--      Added to enable jar creation using mvn command-->

            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>3.3.0</version>
                <configuration>
                    <archive>
                        <manifest>
                            <mainClass>fully.qualified.MainClass</mainClass>
                        </manifest>
                    </archive>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                    <appendAssemblyId>false</appendAssemblyId>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <!-- bind to the packaging phase -->
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>

@Nour-Rekik No, building from source means manually downloading the code from the deeplearning4j repo and building it yourself.

This does not mean building your own code with Maven.

It means using C++ compilers and Java build tools, and using 1.0.0-SNAPSHOT as your version.

In order to do this you will need to figure out what software is already installed on your cluster, like the nvcc version, the gcc version and the like.
Please do let me know that, and follow this guide:

The reason you need to do that is that the source code does support older versions, but you have to build it manually. It's hard to support more than 2 CUDA versions for each release, so our published binaries only cover the 2 most recent ones, but you can build the code to match the version you want.

Beyond that feel free to ask for help with that here.
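To gather that information, something like this should work (the module commands assume your cluster uses Environment Modules or Lmod):

```
# Compiler and build tool versions currently on the PATH
gcc --version | head -n1
cmake --version | head -n1
nvcc --version | tail -n1      # only works if a CUDA toolkit is already loaded

# CUDA/gcc/cmake modules the cluster provides (if it uses modules)
module avail CUDA GCC CMake
```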

Thank you for the explanation, but I didn't find clear steps in the guide for building from source. Also, I don't know if it's feasible on an HPC cluster. I am sorry, but I still don't get how to do it.
On the HPC cluster I have:
gcc --version → 8.3.0
and these are the CUDA versions available on the cluster:

@Nour-Rekik

What you'll need to do is:

  1. Establish what CUDA you have installed. From here I can see CUDA 11.4. That's a good match for something we support. In your case you'll want to follow the building-for-CUDA section of the guide and change it to CUDA 11.4.
     For the command, you'll be using Maven to run the build. Maven also controls the build process for the C++ library that uses CUDA underneath.
     When you initiate the build process you will want to ensure nvcc is on your PATH. Once the right CUDA folder is first on your PATH and it finds the right nvcc, you can start to build your project.

You'll also need to set Maven up on one of the HPC nodes. Maven can be installed without being an admin; just download it from one of these mirrors: https://dlcdn.apache.org/maven/maven-3/3.8.6/binaries/apache-maven-3.8.6-bin.tar.gz
Extract the tar.gz and add the extracted directory's bin to your PATH.

You'll need a relevant Java version as well; in your case you'll need a JDK. I would suggest Azul, and you'll want to download a build for CentOS 7.
Extract the archive and set your JAVA_HOME.

Maven needs that in order to use the relevant Java compiler.

So in summary:

  1. Clone dl4j.
  2. Download Maven and a JDK.
  3. Set up the nvcc and mvn executables on your PATH.
  4. Run ./change-cuda-versions.sh in the root dl4j directory, passing the CUDA version you want.
  5. Run the Maven command:
mvn -Pcuda -Dlibnd4j.cpu.compile.skip=true -Djavacpp.platform=linux-x86_64 -Dlibnd4j.chip=cuda clean install -DskipTests

This will tell dl4j to build CUDA. Let me know if you need anything else.
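Put together, the whole sequence looks roughly like this (a sketch; the tool versions, install directories, JDK path, and CUDA location are assumptions you should adapt to your cluster):

```
# 1. Clone dl4j
git clone https://github.com/deeplearning4j/deeplearning4j.git
cd deeplearning4j

# 2. Maven and a JDK in the home directory (no admin rights needed)
mkdir -p $HOME/tools
wget https://dlcdn.apache.org/maven/maven-3/3.8.6/binaries/apache-maven-3.8.6-bin.tar.gz
tar xzf apache-maven-3.8.6-bin.tar.gz -C $HOME/tools
export PATH=$HOME/tools/apache-maven-3.8.6/bin:$PATH
export JAVA_HOME=$HOME/tools/zulu8-jdk        # wherever you extracted the Azul JDK
export PATH=$JAVA_HOME/bin:$PATH

# 3. Make sure the right nvcc is first on the PATH
export PATH=/usr/local/cuda-11.4/bin:$PATH    # or load your cluster's CUDA module
which nvcc && nvcc --version

# 4. Switch the build to your CUDA version
./change-cuda-versions.sh 11.4

# 5. Build the CUDA backend
mvn -Pcuda -Dlibnd4j.chip=cuda -Dlibnd4j.cpu.compile.skip=true \
    -Djavacpp.platform=linux-x86_64 clean install -DskipTests
```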

Okay, so now these are the loaded modules on my HPC cluster:

I cloned dl4j and changed the CUDA version to 11.0 using ./change-cuda-versions.sh (I didn't use 11.4 because I don't know which gcc version it needs).

Then I used this command in the root dl4j directory:

nore667e@taurusi8033:~/deeplearning4j> mvn -Pcuda -Dlibnd4j.cpu.compile.skip=true  -Djavacpp.platform=linux-x86_64 -Dlibnd4j.chip=cuda  -Pcuda clean install -DskipTests

I am not sure if I should run it in the root of my project or in the root dl4j directory.
But I got this error:

@Nour-Rekik I need the whole stack trace. You probably didn’t set your path up properly.
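A quick sanity check of the PATH setup might look like this (a sketch):

```
# Confirm the build tools all resolve to the versions you expect
which mvn java nvcc cmake
java -version
echo $JAVA_HOME
```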

Here is the whole stack trace:

nore667e@taurusi8033:~/deeplearning4j> mvn -Pcuda -Dlibnd4j.cpu.compile.skip=true  -Djavacpp.platform=linux-x86_64 -Dlibnd4j.chip=cuda  -Pcuda clean install -DskipTests
[INFO] Scanning for projects...
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for org.nd4j:libnd4j:pom:1.0.0-SNAPSHOT
[WARNING] 'dependencyManagement.dependencies.dependency.(groupId:artifactId:type:classifier)' must be unique: org.lz4:lz4-java:jar -> version 1.8.0 vs ${lz4.version} @ org.deeplearning4j:deeplearning4j:1.0.0-SNAPSHOT, /home/h4/nore667e/deeplearning4j/pom.xml, line 333, column 25
[WARNING] 'dependencyManagement.dependencies.dependency.(groupId:artifactId:type:classifier)' must be unique: io.netty:netty-all:jar -> duplicate declaration of version ${netty.version} @ org.deeplearning4j:deeplearning4j:1.0.0-SNAPSHOT, /home/h4/nore667e/deeplearning4j/pom.xml, line 445, column 25
[WARNING] 'dependencyManagement.dependencies.dependency.(groupId:artifactId:type:classifier)' must be unique: io.netty:netty-buffer:jar -> duplicate declaration of version ${netty.version} @ org.deeplearning4j:deeplearning4j:1.0.0-SNAPSHOT, /home/h4/nore667e/deeplearning4j/pom.xml, line 455, column 25
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for org.deeplearning4j:deeplearning4j:pom:1.0.0-SNAPSHOT
[WARNING] 'dependencyManagement.dependencies.dependency.(groupId:artifactId:type:classifier)' must be unique: org.lz4:lz4-java:jar -> version 1.8.0 vs ${lz4.version} @ line 333, column 25
[WARNING] 'dependencyManagement.dependencies.dependency.(groupId:artifactId:type:classifier)' must be unique: io.netty:netty-all:jar -> duplicate declaration of version ${netty.version} @ line 445, column 25
[WARNING] 'dependencyManagement.dependencies.dependency.(groupId:artifactId:type:classifier)' must be unique: io.netty:netty-buffer:jar -> duplicate declaration of version ${netty.version} @ line 455, column 25
[WARNING] 
[WARNING] It is highly recommended to fix these problems because they threaten the stability of your build.
[WARNING] 
[WARNING] For this reason, future Maven versions might no longer support building such malformed projects.
[WARNING] 
[INFO] Inspecting build with total of 75 modules...
[INFO] Not installing Nexus Staging features:
[INFO]  * Preexisting staging related goal bindings found in 75 modules.
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO] 
[INFO] deeplearning4j                                                     [pom]
[INFO] libnd4j                                                            [pom]
[INFO] nd4j                                                               [pom]
[INFO] nd4j-shade                                                         [pom]
[INFO] jackson                                                            [jar]
[INFO] protobuf                                                           [jar]
[INFO] guava                                                              [jar]
[INFO] netty-common                                                       [jar]
[INFO] nd4j-common                                                        [jar]
[INFO] nd4j-backends                                                      [pom]
[INFO] nd4j-api-parent                                                    [pom]
[INFO] nd4j-api                                                           [jar]
[INFO] nd4j-common-tests                                                  [jar]
[INFO] nd4j-serde                                                         [pom]
[INFO] nd4j-aeron                                                         [jar]
[INFO] nd4j-arrow                                                         [jar]
[INFO] resources                                                          [jar]
[INFO] nd4j-kryo                                                          [jar]
[INFO] nd4j-backend-impls                                                 [pom]
[INFO] nd4j-presets-common                                                [jar]
[INFO] nd4j-native-api                                                    [jar]
[INFO] nd4j-cuda-preset                                                   [jar]
[INFO] nd4j-cuda                                                          [jar]
[INFO] nd4j-cuda-platform                                                 [jar]
[INFO] nd4j-parameter-server-parent                                       [pom]
[INFO] nd4j-parameter-server-model                                        [jar]
[INFO] nd4j-parameter-server                                              [jar]
[INFO] nd4j-parameter-server-client                                       [jar]
[INFO] nd4j-parameter-server-rocksdb-storage                              [jar]
[INFO] nd4j-parameter-server-node                                         [jar]
[INFO] nd4j-tensorflow                                                    [jar]
[INFO] nd4j-tensorflow-lite                                               [jar]
[INFO] nd4j-onnxruntime                                                   [jar]
[INFO] nd4j-tvm                                                           [jar]
[INFO] samediff-import                                                    [pom]
[INFO] samediff-import-api                                                [jar]
[INFO] samediff-import-onnx                                               [jar]
[INFO] samediff-import-tensorflow                                         [jar]
[INFO] DataVec                                                            [pom]
[INFO] datavec-api                                                        [jar]
[INFO] datavec-data                                                       [pom]
[INFO] datavec-data-image                                                 [jar]
[INFO] datavec-arrow                                                      [jar]
[INFO] python4j-parent                                                    [pom]
[INFO] python4j-core                                                      [jar]
[INFO] python4j-numpy                                                     [jar]
[INFO] datavec-local                                                      [jar]
[INFO] datavec-spark_2.12                                                 [jar]
[INFO] datavec-jdbc                                                       [jar]
[INFO] datavec-excel                                                      [jar]
[INFO] DeepLearning4j                                                     [pom]
[INFO] deeplearning4j-data                                                [pom]
[INFO] deeplearning4j-datavec-iterators                                   [jar]
[INFO] deeplearning4j-datasets                                            [jar]
[INFO] deeplearning4j-utility-iterators                                   [jar]
[INFO] deeplearning4j-common-tests                                        [jar]
[INFO] deeplearning4j-nn                                                  [jar]
[INFO] deeplearning4j-modelimport                                         [jar]
[INFO] deeplearning4j-ui-parent                                           [pom]
[INFO] deeplearning4j-ui-components                                       [jar]
[INFO] deeplearning4j-core                                                [jar]
[INFO] deeplearning4j-ui-model                                            [jar]
[INFO] deeplearning4j-vertx                                               [jar]
[INFO] deeplearning4j-nlp-parent                                          [pom]
[INFO] deeplearning4j-nlp                                                 [jar]
[INFO] deeplearning4j-ui                                                  [jar]
[INFO] DeepLearning4j-scaleout-parent                                     [pom]
[INFO] Spark parent                                                       [pom]
[INFO] dl4j-spark                                                         [jar]
[INFO] deeplearning4j-parallel-wrapper                                    [jar]
[INFO] dl4j-spark-parameterserver                                         [jar]
[INFO] deeplearning4j-scaleout-parallelwrapper-parameter-server           [jar]
[INFO] deeplearning4j-graph                                               [jar]
[INFO] deeplearning4j-zoo                                                 [jar]
[INFO] omnihub                                                            [jar]
[INFO] 
[INFO] -----------------< org.deeplearning4j:deeplearning4j >------------------
[INFO] Building deeplearning4j 1.0.0-SNAPSHOT                            [1/75]
[INFO] --------------------------------[ pom ]---------------------------------
Downloading from google-maven-central: https://maven-central.storage-download.googleapis.com/maven2/org/apache/maven/plugins/maven-source-plugin/3.2.0/maven-source-plugin-3.2.0.pom
Downloaded from google-maven-central: https://maven-central.storage-download.googleapis.com/maven2/org/apache/maven/plugins/maven-source-plugin/3.2.0/maven-source-plugin-3.2.0.pom (5.7 kB at 4.1 kB/s)
Downloading from google-maven-central: https://maven-central.storage-download.googleapis.com/maven2/org/apache/maven/plugins/maven-plugins/33/maven-plugins-33.pom
Downloaded from google-maven-central: https://maven-central.storage-download.googleapis.com/maven2/org/apache/maven/plugins/maven-plugins/33/maven-plugins-33.pom (0 B at 0 B/s)
Downloading from google-maven-central: https://maven-central.storage-download.googleapis.com/maven2/org/apache/maven/plugins/maven-source-plugin/3.2.0/maven-source-plugin-3.2.0.jar
Downloaded from google-maven-central: https://maven-central.storage-download.googleapis.com/maven2/org/apache/maven/plugins/maven-source-plugin/3.2.0/maven-source-plugin-3.2.0.jar (32 kB at 41 kB/s)
Downloading from google-maven-central: https://maven-central.storage-download.googleapis.com/maven2/org/apache/maven/plugins/maven-enforcer-plugin/1.4.1/maven-enforcer-plugin-1.4.1.pom
Downloaded from google-maven-central: https://maven-central.storage-download.googleapis.com/maven2/org/apache/maven/plugins/maven-enforcer-plugin/1.4.1/maven-enforcer-plugin-1.4.1.pom (7.3 kB at 10 kB/s)
.
.
.
Downloading from google-maven-central: https://maven-central.storage-download.googleapis.com/maven2/org/codehaus/plexus/plexus-io/2.0.4/plexus-io-2.0.4.jar
Downloaded from google-maven-central: https://maven-central.storage-download.googleapis.com/maven2/org/beanshell/bsh/2.0b4/bsh-2.0b4.jar (282 kB at 176 kB/s)
Downloading from google-maven-central: https://maven-central.storage-download.googleapis.com/maven2/junit/junit/4.11/junit-4.11.jar
Downloaded from google-maven-central: https://maven-central.storage-download.googleapis.com/maven2/org/apache/maven/enforcer/enforcer-rules/1.4.1/enforcer-rules-1.4.1.jar (99 kB at 60 kB/s)
Downloading from google-maven-central: https://maven-central.storage-download.googleapis.com/maven2/org/hamcrest/hamcrest-core/1.3/hamcrest-core-1.3.jar
Downloaded from google-maven-central: https://maven-central.storage-download.googleapis.com/maven2/junit/junit/4.11/junit-4.11.jar (0 B at 0 B/s)
Downloaded from google-maven-central: https://maven-central.storage-download.googleapis.com/maven2/org/hamcrest/hamcrest-core/1.3/hamcrest-core-1.3.jar (0 B at 0 B/s)
Downloaded from google-maven-central: https://maven-central.storage-download.googleapis.com/maven2/org/codehaus/plexus/plexus-i18n/1.0-beta-6/plexus-i18n-1.0-beta-6.jar (12 kB at 6.7 kB/s)
Downloaded from google-maven-central: https://maven-central.storage-download.googleapis.com/maven2/org/apache/maven/plugin-testing/maven-plugin-testing-harness/1.3/maven-plugin-testing-harness-1.3.jar (35 kB at 17 kB/s)
Downloaded from google-maven-central: https://maven-central.storage-download.googleapis.com/maven2/org/codehaus/plexus/plexus-io/2.0.4/plexus-io-2.0.4.jar (58 kB at 24 kB/s)
[INFO] 
[INFO] --- maven-enforcer-plugin:1.4.1:enforce (enforce-excluded-dependencies) @ deeplearning4j ---
[INFO] 
[INFO] --- kotlin-maven-plugin:1.4.31:compile (compile) @ deeplearning4j ---
............
[WARNING] No sources found skipping Kotlin compile
[INFO] 
[INFO] --- maven-compiler-plugin:3.8.1:compile (compile) @ deeplearning4j ---
............
[INFO] No sources to compile
[INFO] 
[INFO] --- kotlin-maven-plugin:1.4.31:test-compile (test-compile) @ deeplearning4j ---
[WARNING] No sources found skipping Kotlin compile
[INFO] 
[INFO] --- maven-compiler-plugin:3.8.1:testCompile (testCompile) @ deeplearning4j ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-source-plugin:3.2.0:jar-no-fork (attach-sources) @ deeplearning4j ---
....

[INFO] Installing /home/h4/nore667e/deeplearning4j/pom.xml to /home/nore667e/.m2/repository/org/deeplearning4j/deeplearning4j/1.0.0-SNAPSHOT/deeplearning4j-1.0.0-SNAPSHOT.pom
[INFO] 
[INFO] --------------------------< org.nd4j:libnd4j >--------------------------
[INFO] Building libnd4j 1.0.0-SNAPSHOT                                   [2/75]
[INFO] --------------------------------[ pom ]---------------------------------
.............
[INFO] 
[INFO] --- maven-clean-plugin:2.6:clean (default-clean) @ libnd4j ---
[INFO] 
[INFO] --- maven-clean-plugin:2.6:clean (javacpp-cppbuild-clean) @ libnd4j ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.4.1:enforce (enforce-maven) @ libnd4j ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.4.1:enforce (enforce-excluded-dependencies) @ libnd4j ---
[INFO] 
[INFO] --- javacpp:1.5.7:build (javacpp-cppbuild-validate) @ libnd4j ---
.......
[INFO] Detected platform "linux-x86_64"
[INFO] Building platform "linux-x86_64"
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:cpu-count (get-cpu-count) @ libnd4j ---
......
[INFO] CPU count: 1
[INFO] 
[INFO] --- build-helper-maven-plugin:3.0.0:add-resource (add-resource) @ libnd4j ---
[INFO] 
[INFO] --- kotlin-maven-plugin:1.4.31:compile (compile) @ libnd4j ---
[WARNING] No sources found skipping Kotlin compile
[INFO] 
[INFO] --- maven-compiler-plugin:3.8.1:compile (compile) @ libnd4j ---
[INFO] No sources to compile
[INFO] 
[INFO] --- javacpp:1.5.7:build (javacpp-cppbuild-compile) @ libnd4j ---
[INFO] Skipping execution of JavaCPP Builder
[INFO] 
[INFO] --- javacpp:1.5.7:build (javacpp-cppbuild-compile-cuda) @ libnd4j ---
[INFO] Detected platform "linux-x86_64"
[INFO] Building platform "linux-x86_64"
[INFO] bash buildnativeoperations.sh --build-type release --chip cuda --platform linux-x86_64 --chip-extension '' --chip-version 11.0 --compute '' '' -j 1 -h '' --operations '' --datatypes '' --sanitize OFF --use_lto OFF 
eval cmake
COMPUTE=
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!!                                                                                                           !!
!!                                                                                                           !!
!!                                                                                                           !!
!!                                                                                                           !!
!!                                                 WARNING!                                                  !!
!!                                      No helper packages configured!                                       !!
!!                          You can specify helper by using -h key. I.e. <-h onednn>                         !!
!!                                                                                                           !!
!!                                                                                                           !!
!!                                                                                                           !!
!!                                                                                                           !!
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
PACKAGING = none
BUILD = release
CHIP = cuda
ARCH = x86-64
CHIP_EXTENSION = 
CHIP_VERSION = 11.0
GPU_COMPUTE_CAPABILITY = all
EXPERIMENTAL = no
LIBRARY TYPE = dynamic
OPERATIONS = -DSD_ALL_OPS=true
DATATYPES = 
MINIFIER = -DSD_BUILD_MINIFIER=false
TESTS = -DSD_BUILD_TESTS=OFF
NAME = -DSD_LIBRARY_NAME=nd4jcuda
OPENBLAS_PATH = /home/h4/nore667e/.javacpp/cache/openblas-0.3.19-1.5.7-linux-x86_64.jar/org/bytedeco/openblas/linux-x86_64
CHECK_VECTORIZATION = OFF
HELPERS = 
OP_OUTPUT_FILE =include/generated/include_ops.h
USE_LTO=-DSD_USE_LTO=OFF
SANITIZE=OFF
/home/h4/nore667e/deeplearning4j/libnd4j/blasbuild/cuda
buildnativeoperations.sh: line 676: cmake: command not found
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for deeplearning4j 1.0.0-SNAPSHOT:
[INFO] 
[INFO] deeplearning4j ..................................... SUCCESS [01:18 min]
[INFO] libnd4j ............................................ FAILURE [ 15.802 s]
[INFO] nd4j ............................................... SKIPPED
[INFO] nd4j-shade ......................................... SKIPPED
[INFO] jackson ............................................ SKIPPED
[INFO] protobuf ........................................... SKIPPED
[INFO] guava .............................................. SKIPPED
[INFO] netty-common ....................................... SKIPPED
[INFO] nd4j-common ........................................ SKIPPED
[INFO] nd4j-backends ...................................... SKIPPED
[INFO] nd4j-api-parent .................................... SKIPPED
[INFO] nd4j-api ........................................... SKIPPED
[INFO] nd4j-common-tests .................................. SKIPPED
[INFO] nd4j-serde ......................................... SKIPPED
[INFO] nd4j-aeron ......................................... SKIPPED
[INFO] nd4j-arrow ......................................... SKIPPED
[INFO] resources .......................................... SKIPPED
[INFO] nd4j-kryo .......................................... SKIPPED
[INFO] nd4j-backend-impls ................................. SKIPPED
[INFO] nd4j-presets-common ................................ SKIPPED
[INFO] nd4j-native-api .................................... SKIPPED
[INFO] nd4j-cuda-preset ................................... SKIPPED
[INFO] nd4j-cuda .......................................... SKIPPED
[INFO] nd4j-cuda-platform ................................. SKIPPED
[INFO] nd4j-parameter-server-parent ....................... SKIPPED
[INFO] nd4j-parameter-server-model ........................ SKIPPED
[INFO] nd4j-parameter-server .............................. SKIPPED
[INFO] nd4j-parameter-server-client ....................... SKIPPED
[INFO] nd4j-parameter-server-rocksdb-storage .............. SKIPPED
[INFO] nd4j-parameter-server-node ......................... SKIPPED
[INFO] nd4j-tensorflow .................................... SKIPPED
[INFO] nd4j-tensorflow-lite ............................... SKIPPED
[INFO] nd4j-onnxruntime ................................... SKIPPED
[INFO] nd4j-tvm ........................................... SKIPPED
[INFO] samediff-import .................................... SKIPPED
[INFO] samediff-import-api ................................ SKIPPED
[INFO] samediff-import-onnx ............................... SKIPPED
[INFO] samediff-import-tensorflow ......................... SKIPPED
[INFO] DataVec ............................................ SKIPPED
[INFO] datavec-api ........................................ SKIPPED
[INFO] datavec-data ....................................... SKIPPED
[INFO] datavec-data-image ................................. SKIPPED
[INFO] datavec-arrow ...................................... SKIPPED
[INFO] python4j-parent .................................... SKIPPED
[INFO] python4j-core ...................................... SKIPPED
[INFO] python4j-numpy ..................................... SKIPPED
[INFO] datavec-local ...................................... SKIPPED
[INFO] datavec-spark_2.12 ................................. SKIPPED
[INFO] datavec-jdbc ....................................... SKIPPED
[INFO] datavec-excel ...................................... SKIPPED
[INFO] DeepLearning4j ..................................... SKIPPED
[INFO] deeplearning4j-data ................................ SKIPPED
[INFO] deeplearning4j-datavec-iterators ................... SKIPPED
[INFO] deeplearning4j-datasets ............................ SKIPPED
[INFO] deeplearning4j-utility-iterators ................... SKIPPED
[INFO] deeplearning4j-common-tests ........................ SKIPPED
[INFO] deeplearning4j-nn .................................. SKIPPED
[INFO] deeplearning4j-modelimport ......................... SKIPPED
[INFO] deeplearning4j-ui-parent ........................... SKIPPED
[INFO] deeplearning4j-ui-components ....................... SKIPPED
[INFO] deeplearning4j-core ................................ SKIPPED
[INFO] deeplearning4j-ui-model ............................ SKIPPED
[INFO] deeplearning4j-vertx ............................... SKIPPED
[INFO] deeplearning4j-nlp-parent .......................... SKIPPED
[INFO] deeplearning4j-nlp ................................. SKIPPED
[INFO] deeplearning4j-ui .................................. SKIPPED
[INFO] DeepLearning4j-scaleout-parent ..................... SKIPPED
[INFO] Spark parent ....................................... SKIPPED
[INFO] dl4j-spark ......................................... SKIPPED
[INFO] deeplearning4j-parallel-wrapper .................... SKIPPED
[INFO] dl4j-spark-parameterserver ......................... SKIPPED
[INFO] deeplearning4j-scaleout-parallelwrapper-parameter-server SKIPPED
[INFO] deeplearning4j-graph ............................... SKIPPED
[INFO] deeplearning4j-zoo ................................. SKIPPED
[INFO] omnihub ............................................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  01:38 min
[INFO] Finished at: 2022-08-31T12:54:14+02:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.bytedeco:javacpp:1.5.7:build (javacpp-cppbuild-compile-cuda) on project libnd4j: Execution javacpp-cppbuild-compile-cuda of goal org.bytedeco:javacpp:1.5.7:build failed: Process exited with an error: 127 -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <args> -rf :libnd4j

@Nour-Rekik ah you need to install cmake as well. Sorry about that. Go here:
https://cmake.org/download/

DL4J needs a newer cmake than the one in this guide:

but it does contain some other useful comments around this topic in case you're wondering. I would recommend following the general spirit of that guide, but with the most recent cmake, 3.24. You may need other things in order for this to work.

Note you only need to do this once.

Ensure cmake is in your path similar to the other steps above. This also shouldn’t need admin privileges.
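For example, something along these lines should work without admin rights (the exact release filename is an assumption; check the cmake.org download page, or see whether your cluster already offers a CMake module):

```
# Download and unpack a prebuilt CMake into the home directory
wget https://github.com/Kitware/CMake/releases/download/v3.24.2/cmake-3.24.2-linux-x86_64.tar.gz
mkdir -p $HOME/tools
tar xzf cmake-3.24.2-linux-x86_64.tar.gz -C $HOME/tools
export PATH=$HOME/tools/cmake-3.24.2-linux-x86_64/bin:$PATH
cmake --version
```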

Try to read the errors. If you look right above the failure, you'll see "cmake: command not found", which is how I was able to tell you what to do next.

Thank you. So I searched the installed CMake modules on the HPC and found version 3.23, and I upgraded the CUDA version to 11.4.
Here are the installed modules:

But now I have another error:

-- Configuring done
-- Generating done
-- Build files have been written to: /home/h4/nore667e/deeplearning4j/libnd4j/blasbuild/cuda/flatbuffers-download
[ 11%] Creating directories for 'flatbuffers'
[ 22%] Performing download step (git clone) for 'flatbuffers'
Cloning into 'flatbuffers-src'...
Already on 'main'
[ 33%] Performing update step for 'flatbuffers'
HEAD is now at 31b78f8... Add 1.12.1 frozen fork
[ 44%] No patch step for 'flatbuffers'
[ 55%] No configure step for 'flatbuffers'
[ 66%] No build step for 'flatbuffers'
[ 77%] No install step for 'flatbuffers'
[ 88%] No test step for 'flatbuffers'
[100%] Completed 'flatbuffers'
[100%] Built target flatbuffers
CMake Warning at blasbuild/cuda/flatbuffers-src/CMakeLists.txt:43 (message):
  Cannot build tests without building the compiler.  Tests will be disabled.


-- Looking for strtof_l
-- Looking for strtof_l - found
-- Looking for strtoull_l
-- Looking for strtoull_l - found
fatal: No names found, cannot describe anything.
Adding all ops due to empty op list or SD_ALL_OPS definition: SD ALL OPS: true SD_OPS_LIST:  
definitions will be written to "../include/generated/include_ops.h"
Building x86_64 binary...
Build cublas
CUDA include directory: /sw/installed/CUDA/11.4.2/include
CUDA found!
Enabling fPIC...
-- Configuring done
CMake Warning (dev) in blas/CMakeLists.txt:
  Policy CMP0104 is not set: CMAKE_CUDA_ARCHITECTURES now detected for NVCC,
  empty CUDA_ARCHITECTURES not allowed.  Run "cmake --help-policy CMP0104"
  for policy details.  Use the cmake_policy command to set the policy and
  suppress this warning.

  CUDA_ARCHITECTURES is empty for target "samediff_obj".
This warning is for project developers.  Use -Wno-dev to suppress it.

CMake Warning (dev) in blas/CMakeLists.txt:
  Policy CMP0104 is not set: CMAKE_CUDA_ARCHITECTURES now detected for NVCC,
  empty CUDA_ARCHITECTURES not allowed.  Run "cmake --help-policy CMP0104"
  for policy details.  Use the cmake_policy command to set the policy and
  suppress this warning.

  CUDA_ARCHITECTURES is empty for target "nd4jcuda".
This warning is for project developers.  Use -Wno-dev to suppress it.

-- Generating done
CMake Warning:
  Manually-specified variables were not used by the project:

    BLAS
    CMAKE_NEED_RESPONSE
    DEV
    MKL_MULTI_THREADED
    PACKAGING
    SD_BUILD_MINIFIER


-- Build files have been written to: /home/h4/nore667e/deeplearning4j/libnd4j/blasbuild/cuda
/software/rome/CMake/3.23.1-GCCcore-11.3.0/bin/cmake -S/home/h4/nore667e/deeplearning4j/libnd4j -B/home/h4/nore667e/deeplearning4j/libnd4j/blasbuild/cuda --check-build-system CMakeFiles/Makefile.cmake 0
/software/rome/CMake/3.23.1-GCCcore-11.3.0/bin/cmake -E cmake_progress_start /home/h4/nore667e/deeplearning4j/libnd4j/blasbuild/cuda/CMakeFiles /home/h4/nore667e/deeplearning4j/libnd4j/blasbuild/cuda//CMakeFiles/progress.marks
make  -f CMakeFiles/Makefile2 all
make[1]: Entering directory `/home/h4/nore667e/deeplearning4j/libnd4j/blasbuild/cuda'
make  -f blas/CMakeFiles/samediff_obj.dir/build.make blas/CMakeFiles/samediff_obj.dir/depend
make[2]: Entering directory `/home/h4/nore667e/deeplearning4j/libnd4j/blasbuild/cuda'
cd /home/h4/nore667e/deeplearning4j/libnd4j/blasbuild/cuda && /software/rome/CMake/3.23.1-GCCcore-11.3.0/bin/cmake -E cmake_depends "Unix Makefiles" /home/h4/nore667e/deeplearning4j/libnd4j /home/h4/nore667e/deeplearning4j/libnd4j/blas /home/h4/nore667e/deeplearning4j/libnd4j/blasbuild/cuda /home/h4/nore667e/deeplearning4j/libnd4j/blasbuild/cuda/blas /home/h4/nore667e/deeplearning4j/libnd4j/blasbuild/cuda/blas/CMakeFiles/samediff_obj.dir/DependInfo.cmake --color=
make[2]: Leaving directory `/home/h4/nore667e/deeplearning4j/libnd4j/blasbuild/cuda'
make  -f blas/CMakeFiles/samediff_obj.dir/build.make blas/CMakeFiles/samediff_obj.dir/build
make[2]: Entering directory `/home/h4/nore667e/deeplearning4j/libnd4j/blasbuild/cuda'
[  1%] Building CUDA object blas/CMakeFiles/samediff_obj.dir/__/include/loops/cuda/broadcasting.cu.o
cd /home/h4/nore667e/deeplearning4j/libnd4j/blasbuild/cuda/blas && /sw/installed/CUDA/11.4.2/bin/nvcc -forward-unknown-to-host-compiler -DF_X64=true -DSD_ALL_OPS=1 -DSD_CUDA=true -D__CUDABLAS__=true -I/home/h4/nore667e/deeplearning4j/libnd4j/blasbuild/cuda/flatbuffers-src/include -I/home/h4/nore667e/deeplearning4j/libnd4j/blasbuild/cuda/include -I/home/h4/nore667e/deeplearning4j/libnd4j/include -I/sw/installed/CUDA/11.4.2/include -I/usr/local/include -Xcompiler=-fPIC -Xcompiler=-std=gnu++11 -DCUDA_VERSION_MAJOR=11  -w --cudart=static --expt-extended-lambda -Xfatbin -compress-all -gencode arch=compute_53,code=sm_53 -gencode arch=compute_60,code=sm_60 -gencode arch=compute_61,code=sm_61 -gencode arch=compute_70,code=sm_70 -gencode arch=compute_75,code=sm_75 -gencode arch=compute_80,code=sm_80 -gencode arch=compute_86,code=sm_86 -gencode arch=compute_86,code=compute_86 -O3 -DNDEBUG -std=c++11 -MD -MT blas/CMakeFiles/samediff_obj.dir/__/include/loops/cuda/broadcasting.cu.o -MF CMakeFiles/samediff_obj.dir/__/include/loops/cuda/broadcasting.cu.o.d -x cu -c /home/h4/nore667e/deeplearning4j/libnd4j/include/loops/cuda/broadcasting.cu -o CMakeFiles/samediff_obj.dir/__/include/loops/cuda/broadcasting.cu.o
/software/rome/GCCcore/11.3.0/include/c++/11.3.0/bits/std_function.h:435:145: error: parameter packs not expanded with ‘...’:
  435 |         function(_Functor&& __f)
      |                                                                                                                                                 ^ 
/software/rome/GCCcore/11.3.0/include/c++/11.3.0/bits/std_function.h:435:145: note:         ‘_ArgTypes’
/software/rome/GCCcore/11.3.0/include/c++/11.3.0/bits/std_function.h:530:146: error: parameter packs not expanded with ‘...’:
  530 |         operator=(_Functor&& __f)
      |                                                                                                                                                  ^ 
/software/rome/GCCcore/11.3.0/include/c++/11.3.0/bits/std_function.h:530:146: note:         ‘_ArgTypes’
make[2]: *** [blas/CMakeFiles/samediff_obj.dir/__/include/loops/cuda/broadcasting.cu.o] Error 1
make[2]: Leaving directory `/home/h4/nore667e/deeplearning4j/libnd4j/blasbuild/cuda'
make[1]: *** [blas/CMakeFiles/samediff_obj.dir/all] Error 2
make[1]: Leaving directory `/home/h4/nore667e/deeplearning4j/libnd4j/blasbuild/cuda'
make: *** [all] Error 2
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for deeplearning4j 1.0.0-SNAPSHOT:
[INFO] 
[INFO] deeplearning4j ..................................... SUCCESS [  6.758 s]
[INFO] libnd4j ............................................ FAILURE [ 53.953 s]
[INFO] nd4j ............................................... SKIPPED
[INFO] nd4j-shade ......................................... SKIPPED
[INFO] jackson ............................................ SKIPPED
[INFO] protobuf ........................................... SKIPPED
[INFO] guava .............................................. SKIPPED
[INFO] netty-common ....................................... SKIPPED
[INFO] nd4j-common ........................................ SKIPPED
[INFO] nd4j-backends ...................................... SKIPPED
[INFO] nd4j-api-parent .................................... SKIPPED
[INFO] nd4j-api ........................................... SKIPPED
[INFO] nd4j-common-tests .................................. SKIPPED
[INFO] nd4j-serde ......................................... SKIPPED
[INFO] nd4j-aeron ......................................... SKIPPED
[INFO] nd4j-arrow ......................................... SKIPPED
[INFO] resources .......................................... SKIPPED
[INFO] nd4j-kryo .......................................... SKIPPED
[INFO] nd4j-backend-impls ................................. SKIPPED
[INFO] nd4j-presets-common ................................ SKIPPED
[INFO] nd4j-native-api .................................... SKIPPED
[INFO] nd4j-cuda-preset ................................... SKIPPED
[INFO] nd4j-cuda .......................................... SKIPPED
[INFO] nd4j-cuda-platform ................................. SKIPPED
[INFO] nd4j-parameter-server-parent ....................... SKIPPED
[INFO] nd4j-parameter-server-model ........................ SKIPPED
[INFO] nd4j-parameter-server .............................. SKIPPED
[INFO] nd4j-parameter-server-client ....................... SKIPPED
[INFO] nd4j-parameter-server-rocksdb-storage .............. SKIPPED
[INFO] nd4j-parameter-server-node ......................... SKIPPED
[INFO] nd4j-tensorflow .................................... SKIPPED
[INFO] nd4j-tensorflow-lite ............................... SKIPPED
[INFO] nd4j-onnxruntime ................................... SKIPPED
[INFO] nd4j-tvm ........................................... SKIPPED
[INFO] samediff-import .................................... SKIPPED
[INFO] samediff-import-api ................................ SKIPPED
[INFO] samediff-import-onnx ............................... SKIPPED
[INFO] samediff-import-tensorflow ......................... SKIPPED
[INFO] DataVec ............................................ SKIPPED
[INFO] datavec-api ........................................ SKIPPED
[INFO] datavec-data ....................................... SKIPPED
[INFO] datavec-data-image ................................. SKIPPED
[INFO] datavec-arrow ...................................... SKIPPED
[INFO] python4j-parent .................................... SKIPPED
[INFO] python4j-core ...................................... SKIPPED
[INFO] python4j-numpy ..................................... SKIPPED
[INFO] datavec-local ...................................... SKIPPED
[INFO] datavec-spark_2.12 ................................. SKIPPED
[INFO] datavec-jdbc ....................................... SKIPPED
[INFO] datavec-excel ...................................... SKIPPED
[INFO] DeepLearning4j ..................................... SKIPPED
[INFO] deeplearning4j-data ................................ SKIPPED
[INFO] deeplearning4j-datavec-iterators ................... SKIPPED
[INFO] deeplearning4j-datasets ............................ SKIPPED
[INFO] deeplearning4j-utility-iterators ................... SKIPPED
[INFO] deeplearning4j-common-tests ........................ SKIPPED
[INFO] deeplearning4j-nn .................................. SKIPPED
[INFO] deeplearning4j-modelimport ......................... SKIPPED
[INFO] deeplearning4j-ui-parent ........................... SKIPPED
[INFO] deeplearning4j-ui-components ....................... SKIPPED
[INFO] deeplearning4j-core ................................ SKIPPED
[INFO] deeplearning4j-ui-model ............................ SKIPPED
[INFO] deeplearning4j-vertx ............................... SKIPPED
[INFO] deeplearning4j-nlp-parent .......................... SKIPPED
[INFO] deeplearning4j-nlp ................................. SKIPPED
[INFO] deeplearning4j-ui .................................. SKIPPED
[INFO] DeepLearning4j-scaleout-parent ..................... SKIPPED
[INFO] Spark parent ....................................... SKIPPED
[INFO] dl4j-spark ......................................... SKIPPED
[INFO] deeplearning4j-parallel-wrapper .................... SKIPPED
[INFO] dl4j-spark-parameterserver ......................... SKIPPED
[INFO] deeplearning4j-scaleout-parallelwrapper-parameter-server SKIPPED
[INFO] deeplearning4j-graph ............................... SKIPPED
[INFO] deeplearning4j-zoo ................................. SKIPPED
[INFO] omnihub ............................................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  01:07 min
[INFO] Finished at: 2022-08-31T16:19:35+02:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.bytedeco:javacpp:1.5.7:build (javacpp-cppbuild-compile-cuda) on project libnd4j: Execution javacpp-cppbuild-compile-cuda of goal org.bytedeco:javacpp:1.5.7:build failed: Process exited with an error: 2 -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <args> -rf :libnd4j