Assign value to a placeholder

Thanks. The file does exist. I checked manually and programmatically.

Here is the output:

 saveFileForInference exists --- 
 saveFileForInference.getAbsolutePath() - /home/adonnini1/Development/ContextQSourceCode/NeuralNetworks/deeplearning4j-examples-master_1/dl4j-examples/src/main/assets/location_next_neural_network_v6_07.fb
 saveFileForInference.length() - 8664

and, here is the trace:

Exception in thread "main" java.lang.NullPointerException
	at org.nd4j.linalg.api.ops.impl.layers.recurrent.LSTMLayer.configureWithSameDiff(LSTMLayer.java:210)
	at org.nd4j.autodiff.samediff.SameDiff.fromFlatBuffers(SameDiff.java:6058)
	at org.nd4j.autodiff.samediff.SameDiff.fromFlatFile(SameDiff.java:5914)
	at org.nd4j.autodiff.samediff.SameDiff.fromFlatFile(SameDiff.java:5895)
	at org.deeplearning4j.examples.quickstart.modeling.recurrent.LocationNextNeuralNetworkV6_03.sameDiff3(LocationNextNeuralNetworkV6_03.java:402)
	at org.deeplearning4j.examples.quickstart.modeling.recurrent.LocationNextNeuralNetworkV6_03.main(LocationNextNeuralNetworkV6_03.java:155)

Do you still think the file is missing? Please keep in mind that I went through the same steps, saving and retrieving saved networks, when implementing with dl4j, and never had any problems, nor did I spend so much time resolving what appears to be a straightforward mistake on my part.
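For reference, the save/load round trip boils down to something like this (a minimal sketch with a toy graph standing in for my actual LSTM network; the variable names here are just illustrative):

```java
import java.io.File;
import org.nd4j.autodiff.samediff.SDVariable;
import org.nd4j.autodiff.samediff.SameDiff;
import org.nd4j.linalg.api.buffer.DataType;

// Toy graph standing in for the real network
SameDiff sd = SameDiff.create();
SDVariable in = sd.placeHolder("in", DataType.FLOAT, -1, 4);
SDVariable w = sd.var("w", DataType.FLOAT, 4, 3);
sd.nn().softmax("softmax", in.mmul(w));

// Save to flatbuffers, then load it back; the load is the step that throws for me
File f = File.createTempFile("tiny_net", ".fb");
sd.asFlatFile(f);
SameDiff restored = SameDiff.fromFlatFile(f);
```

With a simple graph like this the round trip works; the NPE only shows up with the LSTM network.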

@adonnini that actually might be a bug. DM me your file so I can ensure this is fixed with the latest release. I’m 99% sure it is.

Hi Adam,

The saved network file is attached to this message.

Should I upgrade to the latest release? Currently, I am using 1.0.0-M2.1

On a different subject,

I have not been able to get visualization to work when using SameDiff. SameDiff does not support StatsListener, as far as I can tell. I tried using ScoreListener without success. Any pointers or sample code? With dl4j, I was able to activate visualization using the code in your documentation.

Thanks,

Alex

(Attachment location_next_neural_network_v6_07.fb is missing)

Stupid question: how can I send you a file that is not jpg, png, or another format allowed for attachments? Sorry.

@adonnini ah sorry, thanks for reminding me. No, you’d need to build from source to do that. I’ll have it ready soon; I’m finishing up the cuda tests now.

Here’s an example:

    import java.io.File;
    import java.util.HashMap;
    import java.util.Map;

    import org.deeplearning4j.datasets.iterator.impl.IrisDataSetIterator;
    import org.nd4j.autodiff.listeners.impl.UIListener;
    import org.nd4j.autodiff.samediff.SameDiff;
    import org.nd4j.autodiff.samediff.TrainingConfig;
    import org.nd4j.evaluation.classification.Evaluation;
    import org.nd4j.linalg.api.ndarray.INDArray;
    import org.nd4j.linalg.learning.config.Adam;

    IrisDataSetIterator iter = new IrisDataSetIterator(150, 150);

    SameDiff sd = getSimpleNet();

    // Write UI stats to a log file (testDir here is a JUnit temporary directory)
    File dir = testDir.toFile();
    File f = new File(dir, "logFile.bin");
    UIListener l = UIListener.builder(f)
            .plotLosses(1)
            .trainEvaluationMetrics("softmax", 0, Evaluation.Metric.ACCURACY, Evaluation.Metric.F1)
            .updateRatios(1)
            .build();

    sd.setListeners(l);

    sd.setTrainingConfig(TrainingConfig.builder()
            .dataSetFeatureMapping("in")
            .dataSetLabelMapping("label")
            .updater(new Adam(1e-1))
            .weightDecay(1e-3, true)
            .build());

    sd.fit(iter, 20);

    // Test inference after training, with the UI listener still attached
    Map<String, INDArray> m = new HashMap<>();
    iter.reset();
    m.put("in", iter.next().getFeatures());
    INDArray out = sd.outputSingle(m, "softmax");
Of note, please feel free to poke around in the tests in the deeplearning4j repo if something’s missing from the examples. That will often give you some good answers.

Thanks. This is helpful.

Using the code in the example you sent me, I was able to produce a UIListener log file in my chosen location.

However, unless I am mistaken, UIServer.attach accepts only a parameter of type StatsStorage.

So I don’t see how I can pass the file produced by UIListener to UIServer.

By the way, it looks like there is no option to keep the UIListener file in memory. At least, I could not find one.

Is there a different UIServer used with SameDiff?

Thanks

Hi Adam,

I re-read your answer

SameDiff does not let you save the normalizer used when preparing the data, as is possible with dl4j.

This means that when I call revertLabels on the INDArray produced by output on the Android side, I do not get the actual label values but the normalized values.

I guess I am trying to figure out what the equivalent of revertLabels is in SameDiff, and how I carry over to the Android side the normalizer I used when preparing data for the network.

I have one idea for solving this problem: when running inference in the Android app, revert labels using a normalizer from a network saved with dl4j. Do you think this would work?

What am I missing or doing wrong?

Thanks

@adonnini sorry, I’m not quite getting the problem. What is stopping you from loading the normalizer separately? The normalizer has its own save and load methods.
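Roughly like this (a sketch, assuming NormalizerStandardize; adjust to whichever normalizer you actually fit, and swap the stand-in DataSet for your real training data):

```java
import java.io.File;
import org.nd4j.linalg.dataset.DataSet;
import org.nd4j.linalg.dataset.api.preprocessor.NormalizerStandardize;
import org.nd4j.linalg.dataset.api.preprocessor.serializer.NormalizerSerializer;
import org.nd4j.linalg.factory.Nd4j;

// Stand-in training data; your real DataSet goes here
DataSet trainingData = new DataSet(Nd4j.rand(10, 4), Nd4j.rand(10, 3).mul(100));

// Training side: fit and apply the normalizer, then save it on its own
NormalizerStandardize normalizer = new NormalizerStandardize();
normalizer.fitLabel(true);                 // so labels can be reverted later
normalizer.fit(trainingData);
normalizer.transform(trainingData);

File normFile = File.createTempFile("normalizer", ".bin");
NormalizerSerializer.getDefault().write(normalizer, normFile);

// Inference side (e.g. Android): restore the normalizer and revert the output
NormalizerStandardize restored = NormalizerSerializer.getDefault().restore(normFile);
restored.revertLabels(trainingData.getLabels());   // in-place de-normalization
```

The normalizer file is completely independent of the SameDiff .fb file, so you can ship both to the Android app side by side.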

Sorry. You are right. Missing the obvious.

When do you think the upcoming release will be ready? I would like to try restoring the saved SameDiff network in my Android app and run inference using it.

Thanks

Great, just do that and you should be fine. Let me know if you have any other issues. Note that when you use dl4j normally, you just restore the normalizer; SameDiff isn’t as easy to use there. That is one thing I could polish after the release.

Thanks. I can save the normalizer using dl4j and retrieve it also using dl4j, since the normalizer code is exactly the same in SameDiff and dl4j.
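Concretely, the dl4j route I have in mind looks something like this (a sketch with a throwaway model just so there is a model zip to attach to; the paths and the NormalizerStandardize choice are illustrative, not my actual code):

```java
import java.io.File;
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.util.ModelSerializer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.dataset.DataSet;
import org.nd4j.linalg.dataset.api.preprocessor.DataNormalization;
import org.nd4j.linalg.dataset.api.preprocessor.NormalizerStandardize;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.lossfunctions.LossFunctions;

// A throwaway dl4j model, only so there is a model zip to attach the normalizer to
MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
        .list()
        .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                .nIn(4).nOut(3).activation(Activation.SOFTMAX).build())
        .build();
MultiLayerNetwork net = new MultiLayerNetwork(conf);
net.init();

File modelFile = File.createTempFile("dl4j_model", ".zip");
ModelSerializer.writeModel(net, modelFile, true);

// Fit the normalizer, then append it to the model zip
NormalizerStandardize normalizer = new NormalizerStandardize();
normalizer.fitLabel(true);
normalizer.fit(new DataSet(Nd4j.rand(10, 4), Nd4j.rand(10, 3)));
ModelSerializer.addNormalizerToModel(modelFile, normalizer);

// Android side: restore the normalizer from the same file
DataNormalization restoredNorm = ModelSerializer.restoreNormalizerFromFile(modelFile);
```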

Please let me know about the new release when you have a chance.

Thanks

As it stands, cpu is done; I just need to finish gpu. Right now I am on vacation this week. I will be back after that and will publish the M3 docs and release notes after testing.

Sorry. I didn’t know that. Enjoy the rest of your vacation!

Hi Adam,
Were you able to complete the M3 release process? Without a couple of fixes in the new release, I cannot test networks created with SameDiff in my Android application.
Thanks,
Alex

@adonnini unfortunately not yet. I’ll announce it when I do. Sorry for the late reply; just ping me as usual. Unfortunately, I’m still in the middle of cleaning up some technical debt. I’m investigating a fairly hairy aspect of the project that I won’t bother explaining unless you want a c++ lesson :slight_smile: Due to the enterprise usage and the productization plans, the last year has been focused on stripping the framework down, and right now I’m in the middle of a fairly large set of cuda changes. I’m not rewriting anything, but I am changing some aspects of the internals that have been tricky to debug.

Much of that has to do with rerunning the thousands of tests the framework has, over and over, trying to nail down deallocation race conditions, improve the tooling around debugging, and improve the build scripts.

Much of that work isn’t user facing or anything you’d even see on the surface, beyond perhaps some removed features or just general stability.

Along the way I’ve been documenting these things as well.

Hi Adam,
I hope I am not being too much of a nuisance.
How are things coming along with the M3 release?
Thanks