I am trying to install the examples for the Deep Learning book and it has not been going well. I have at least got to the point where I have a working Java JDK that Maven uses. I’ll skip that misery and focus on the following.
Running the Maven build results in a compile failure with this error. I have not the slightest idea how to fix it.
[WARNING] Rule 1: org.apache.maven.plugins.enforcer.RequireFilesExist failed with message:
!!! You have to compile libnd4j with cpu support first!
Some required files are missing:
/home/amon/DeepLearning/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-native/../../../../libnd4j/blas/NativeOps.h
/home/amon/DeepLearning/nd4j/nd4j-backends/nd4j-backend-impls/nd4j-native/../../../../libnd4j/blasbuild/cpu/blas
Try to stick to using Maven. I’m not sure why you would need to build from source. That involves C code and installing a bunch of tools Java devs generally aren’t familiar with. If you don’t need to go through the hassle, I recommend against it.
Actually, that is all I did: git, Maven, and the Maven build
fails.
Just so you know, I’m actually a very experienced C/ObjC coder
and what I know about Java is… well I read the book 10 years
ago… Never wrote a line of it in my life, but the syntax is
pretty straightforward.
If something is broken in the build process as documented in
Appendix G, I’m happy enough to work through it if you’ve got
my back. It will be slow going, because this is a side project and
I have a lot of systems engineering taking precedence over it.
Right, so it’s not about whether you’re good at C or not; it’s the fact that you don’t need to compile from source, and the book examples are out of date. You can import the maintained, updated examples that I highlighted into IntelliJ or Eclipse as a Maven project and it will work fine.
There’s also the matter of the steps for setting up Java and the Java Native Interface (JNI).
The combination of things makes it hairy. Like I said, there is zero need to compile anything from source, which is what you’re attempting to do here. ND4J is a JNI-based tool on top of a C++ library we maintain.
That’s a CMake project. You’re injecting a bunch of unneeded complexity into your workflow here when instead you could just have Maven download the pre-baked JARs that already exist.
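As a sketch of what that means in practice: the examples’ pom.xml simply declares the pre-built backend as a dependency, and Maven downloads the native binaries from Maven Central — no libnd4j compile involved. The coordinates below are the standard DL4J/ND4J artifacts; the version is deliberately left as a placeholder, so check Maven Central for the current release.

```xml
<!-- Sketch: depend on the pre-built CPU backend instead of compiling libnd4j.
     The version value is a placeholder; look up the current release on Maven Central. -->
<properties>
  <dl4j.version><!-- current release from Maven Central --></dl4j.version>
</properties>
<dependencies>
  <dependency>
    <groupId>org.deeplearning4j</groupId>
    <artifactId>deeplearning4j-core</artifactId>
    <version>${dl4j.version}</version>
  </dependency>
  <dependency>
    <!-- Bundles pre-built native CPU binaries for the common platforms -->
    <groupId>org.nd4j</groupId>
    <artifactId>nd4j-native-platform</artifactId>
    <version>${dl4j.version}</version>
  </dependency>
</dependencies>
```

With this in place, `mvn package` (or an IDE Maven import) pulls everything needed, which is exactly why building from source is unnecessary for running the examples.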
From your description here, I think the delta is being new to Maven and the associated IDE tooling in Java. My suggestion would be:
Learn how to use Maven properly. It’s the industry standard in Java and has been around for more than 10 years. It’s what almost every open source project uses. We try to be as off-the-shelf as we can with respect to what’s done within the Java language universe.
We’re in the middle of improving it and welcome feedback. Please use this thread and we will help you as best as we can.
I apologize for your first getting-started experience here, but most of the complexity comes from being unfamiliar with Maven and is not really specific to the project itself.
I hope to get back to this over the weekend. I suspect the first thing to do is nuke all the downloads and working directories from the first two build attempts, just so they don’t get in the way when I try to do it differently.
Actually, I have an hour before I have to run my weekly management videocon, so I decided to do the build right now. Seems to be going well, but there is an error in the quickstart. It has the wrong directory for the cd, i.e. ‘cd dl4j-examples/’, but it should be ‘cd deeplearning4j-examples/’
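For what it’s worth, the clone directory is always named after the repository, which is why the quickstart’s `cd` is wrong. A quick way to see what the correct directory will be (the repo URL here is assumed from context):

```shell
# git clone creates a directory named after the repository,
# so the `cd` in the quickstart has to match the repo name.
# (URL assumed from context; adjust if the quickstart points elsewhere.)
REPO_URL="https://github.com/deeplearning4j/deeplearning4j-examples.git"
DIR="$(basename "$REPO_URL" .git)"
echo "cd $DIR/"
```

Running this prints `cd deeplearning4j-examples/`, matching the corrected step above.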
I selected the deeplearning4j directory from IntelliJ and it brought up a panel with a bottom bar checking dependencies and a main section showing the README… and none of it seems to have any relationship to the quickstart document which has as the next step “Choose ‘Import project from external model’ and ensure that Maven is selected”. Perhaps after it finishes Resolving dependencies it will show me something that relates back to the instructions?
Are you still attempting to build from source for some reason? The examples and deeplearning4j master are fundamentally incompatible. Deeplearning4j master has about a year-plus of changes.
Why not just import the examples and run as is? Is there some objective you have?
IntelliJ with its Maven support will even download and install the deps for you.
If you have some reason to download the source and work with it, I’m happy to help, but if you want to just run the examples, you’re doing something unneeded.
Simply following the instructions in the howto. I downloaded IntelliJ from the specified site, installed it, picked the path to the directory where the code had been installed in the earlier part of the instructions, and that is what I got.
Once the project is open, it will download all necessary dependencies. Depending on your Internet connection this can take a while, because the examples cover a lot of possible use cases.
Still in IntelliJ, open up the dl4j-examples folder, go to src/main/java/org/deeplearning4j/examples/feedforward/classification/MLPClassifierLinear.java, and run it by clicking the green run triangle left of the class name and selecting “Run MLPClassifierLinear.main()”.
Within a few seconds you should see a window opening showing you the results (that is specific to this example, and not something that DL4J does)
as well as the following output on the console:
o.n.l.f.Nd4jBackend - Loaded [CpuBackend] backend
o.n.n.NativeOpsHolder - Number of threads used for linear algebra: 4
o.n.l.c.n.CpuNDArrayFactory - *********************************** CPU Feature Check Warning ***********************************
o.n.l.c.n.CpuNDArrayFactory - Warning: Initializing ND4J with Generic x86 binary on a CPU with AVX/AVX2 support
o.n.l.c.n.CpuNDArrayFactory - Using ND4J with AVX/AVX2 will improve performance. See deeplearning4j.org/cpu for more details
o.n.l.c.n.CpuNDArrayFactory - Or set environment variable ND4J_IGNORE_AVX=true to suppress this warning
o.n.l.c.n.CpuNDArrayFactory - *************************************************************************************************
o.n.n.Nd4jBlas - Number of threads used for OpenMP BLAS: 4
o.n.l.a.o.e.DefaultOpExecutioner - Backend used: [CPU]; OS: [Windows 10]
o.n.l.a.o.e.DefaultOpExecutioner - Cores: [8]; Memory: [8,0GB];
o.n.l.a.o.e.DefaultOpExecutioner - Blas vendor: [OPENBLAS]
o.d.n.m.MultiLayerNetwork - Starting MultiLayerNetwork with WorkspaceModes set to [training: ENABLED; inference: ENABLED], cacheMode set to [NONE]
o.d.o.l.ScoreIterationListener - Score at iteration 0 is 0.49040191650390624
o.d.o.l.ScoreIterationListener - Score at iteration 10 is 0.5371029281616211
o.d.o.l.ScoreIterationListener - Score at iteration 20 is 0.45942108154296873
o.d.o.l.ScoreIterationListener - Score at iteration 30 is 0.4248866271972656
o.d.o.l.ScoreIterationListener - Score at iteration 40 is 0.35806324005126955
o.d.o.l.ScoreIterationListener - Score at iteration 50 is 0.334898681640625
o.d.o.l.ScoreIterationListener - Score at iteration 60 is 0.28348138809204104
o.d.o.l.ScoreIterationListener - Score at iteration 70 is 0.25856330871582034
o.d.o.l.ScoreIterationListener - Score at iteration 80 is 0.2237623405456543
o.d.o.l.ScoreIterationListener - Score at iteration 90 is 0.20081315994262694
o.d.o.l.ScoreIterationListener - Score at iteration 100 is 0.17935667037963868
o.d.o.l.ScoreIterationListener - Score at iteration 110 is 0.15812795639038085
o.d.o.l.ScoreIterationListener - Score at iteration 120 is 0.1466454029083252
o.d.o.l.ScoreIterationListener - Score at iteration 130 is 0.12723068237304688
o.d.o.l.ScoreIterationListener - Score at iteration 140 is 0.12246221542358399
o.d.o.l.ScoreIterationListener - Score at iteration 150 is 0.10459692001342774
o.d.o.l.ScoreIterationListener - Score at iteration 160 is 0.1043481159210205
o.d.o.l.ScoreIterationListener - Score at iteration 170 is 0.08775687217712402
o.d.o.l.ScoreIterationListener - Score at iteration 180 is 0.09047154426574706
o.d.o.l.ScoreIterationListener - Score at iteration 190 is 0.07497384071350098
o.d.o.l.ScoreIterationListener - Score at iteration 200 is 0.07967714309692382
o.d.o.l.ScoreIterationListener - Score at iteration 210 is 0.06506465911865235
o.d.o.l.ScoreIterationListener - Score at iteration 220 is 0.07103242874145507
o.d.o.l.ScoreIterationListener - Score at iteration 230 is 0.05724238395690918
o.d.o.l.ScoreIterationListener - Score at iteration 240 is 0.06402899742126465
o.d.o.l.ScoreIterationListener - Score at iteration 250 is 0.050963459014892576
o.d.o.l.ScoreIterationListener - Score at iteration 260 is 0.05824000835418701
o.d.o.l.ScoreIterationListener - Score at iteration 270 is 0.04582953453063965
o.d.o.l.ScoreIterationListener - Score at iteration 280 is 0.05342156887054443
o.d.o.l.ScoreIterationListener - Score at iteration 290 is 0.04158044815063477
o.d.o.l.ScoreIterationListener - Score at iteration 300 is 0.04932633399963379
o.d.o.l.ScoreIterationListener - Score at iteration 310 is 0.038008933067321775
o.d.o.l.ScoreIterationListener - Score at iteration 320 is 0.0458131217956543
o.d.o.l.ScoreIterationListener - Score at iteration 330 is 0.03496615409851074
o.d.o.l.ScoreIterationListener - Score at iteration 340 is 0.04275306701660156
o.d.o.l.ScoreIterationListener - Score at iteration 350 is 0.032343897819519046
o.d.o.l.ScoreIterationListener - Score at iteration 360 is 0.040063905715942386
o.d.o.l.ScoreIterationListener - Score at iteration 370 is 0.03006138801574707
o.d.o.l.ScoreIterationListener - Score at iteration 380 is 0.037681238651275636
o.d.o.l.ScoreIterationListener - Score at iteration 390 is 0.02809156894683838
o.d.o.l.ScoreIterationListener - Score at iteration 400 is 0.0355769944190979
o.d.o.l.ScoreIterationListener - Score at iteration 410 is 0.02637479066848755
o.d.o.l.ScoreIterationListener - Score at iteration 420 is 0.03370037794113159
o.d.o.l.ScoreIterationListener - Score at iteration 430 is 0.024857771396636964
o.d.o.l.ScoreIterationListener - Score at iteration 440 is 0.03200906991958618
o.d.o.l.ScoreIterationListener - Score at iteration 450 is 0.023510570526123046
o.d.o.l.ScoreIterationListener - Score at iteration 460 is 0.030486409664154054
o.d.o.l.ScoreIterationListener - Score at iteration 470 is 0.022306795120239257
o.d.o.l.ScoreIterationListener - Score at iteration 480 is 0.02910691738128662
o.d.o.l.ScoreIterationListener - Score at iteration 490 is 0.02122408151626587
o.d.o.l.ScoreIterationListener - Score at iteration 500 is 0.02785438060760498
o.d.o.l.ScoreIterationListener - Score at iteration 510 is 0.020247113704681397
o.d.o.l.ScoreIterationListener - Score at iteration 520 is 0.0267118239402771
o.d.o.l.ScoreIterationListener - Score at iteration 530 is 0.019360045194625853
o.d.o.l.ScoreIterationListener - Score at iteration 540 is 0.02565845727920532
o.d.o.l.ScoreIterationListener - Score at iteration 550 is 0.018552312850952147
o.d.o.l.ScoreIterationListener - Score at iteration 560 is 0.024690260887145998
o.d.o.l.ScoreIterationListener - Score at iteration 570 is 0.017813940048217774
o.d.o.l.ScoreIterationListener - Score at iteration 580 is 0.023794214725494384
o.d.o.l.ScoreIterationListener - Score at iteration 590 is 0.017136224508285523
Evaluate model....
========================Evaluation Metrics========================
# of classes: 2
Accuracy: 1,0000
Precision: 1,0000
Recall: 1,0000
F1 Score: 1,0000
Precision, recall & F1: reported for positive class (class 1 - "1") only
=========================Confusion Matrix=========================
  0   1
---------
100   0 | 0 = 0
  0 100 | 1 = 1
Confusion matrix format: Actual (rowClass) predicted as (columnClass) N times
==================================================================
****************Example finished********************
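Two small, hedged footnotes on the run above: the same example can also be launched from the command line (this assumes Maven’s standard plugin-prefix resolution finds the exec-maven-plugin, and it uses the class path from the quickstart step above), and the AVX warning can be silenced exactly as the log output itself suggests:

```shell
# Optional: run the example without the IDE. Assumes Maven's standard
# plugin-prefix resolution for exec-maven-plugin; class name taken from
# the quickstart step above.
#   mvn -q compile exec:java \
#       -Dexec.mainClass=org.deeplearning4j.examples.feedforward.classification.MLPClassifierLinear
# Silence the generic-binary warning, as the log itself suggests:
export ND4J_IGNORE_AVX=true
echo "ND4J_IGNORE_AVX=$ND4J_IGNORE_AVX"
```

Setting the variable only hides the warning; the AVX-optimized binaries mentioned at deeplearning4j.org/cpu are the actual performance fix.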
As @agibsonccc already said, the development of DL4J continues at a fast pace, so unfortunately the book is a bit outdated already. Also the book was written before DL4J was an Eclipse project, so a few links there are also outdated.
Since you’ve been so kind to respond, I thought I’d give you some
idea of what I am actually up to.
I’m using the Deep Learning book essentially as a textbook for
the whole field of Deep Learning. The DL4J software is for me
essentially the set of ‘Practicals’ to ensure I have understood
the material. I am planning on two full readings (I am up to Ch9 on
the first reading) I intend to go through code and run examples
during the second reading as if I were taking a class.
I did my grad studies in AI/CogSci under Dr. Simon at CMU a very
long time ago and had a little familiarity with Perceptrons and
Neisser’s work, even had to build a threshold gate out of transistors
for an EE lab. Much later I worked with Dr. Minsky on a Nanotech
proposal… you know, the guy who caused this whole field to
die for a decade!
What I am seeing in your book is very interesting, in particular
with the style that feeds back to generate output. That made me
sit up. Things like “Introspection”, “Consciousness”, “Dreaming”,
“Sleep States” came to mind. I can see very deep ties with those
and the way they work in people, and how evolution playing out
with neuronal nets might create them.
Now as to why I am going to the trouble of studying the field.
One is a desire to tie in new developments to update my
knowledge; the other is the possibility of some applications
in my own aerospace area.
Basically, I’m not particularly interested in Java, but I am very
interested in DL4J as a study and learning tool for the field in
general.
Both the field and DL4J have been progressing quite fast since the book was released.
The code examples in the book may not work exactly the same anymore, so make sure to also keep an eye on the official documentation (we are working on improving it, but creating “better” instead of “more” takes some time) and the examples.
@amon appreciate the feedback. I’m glad you are going to use it as a textbook. The basic architectures are still largely untouched and should work fine but you really need to work with us on the code examples.
We’re happy to support you here. Our team backs the DL4J software and I’m one of the coauthors of the book. We might evaluate doing an update after 1.0 is released, but right now it’s not worth it. We’re still waiting to get one last major feature out (our SameDiff library, which is similar to TensorFlow) before we do that; then a book update makes sense.
Beyond that, you have to understand that we’re people with customers and have to cope with the state of the market and different user bases, in addition to things like the book.
I don’t think us asking you to use a version of the software that’s up to date and that we can actually support is unreasonable. Any software team has maintenance costs and has to deal with bit rot.
This includes the book examples. So beyond a standard update that we put proper planning and resources into, you’re going to have to work with us a bit.
No problem with that, and I don’t actually wish to take up much
of your time, or any if I can help it. I’m in the same boat,
running my own company 24x7x365 and trying to keep myself
current at the same time.
I will likely only show up now and then when time allows and hope
to not be much of a bother.