NullPointerException org.nd4j.autodiff.samediff.internal.Variable.getOutputOfOp()

@adonnini if you ever get around to it I’m happy to still test your code on my end. My ask is still the same: the one-liner. That should be doable on your end since it’s your code. If you want, maybe try giving it to Claude or ChatGPT. Maybe one of the coding tools can help you with extracting everything.

@agibsonccc
I would be glad to do as you ask. However, I am not sure I understand what you are looking for.

Are you looking for a single source (Java) file to run my model? If that is what you are looking for, I would need to create it. It would require a complete restructuring of the code as currently I have ten modules in the project.

I am sorry if I misunderstood what you are asking.

May I ask how giving you a single-file executable would help address the flatbuffers issue, now that we have established that the issue is independent of the flatbuffers release and that resolving it would likely require changes to your code base?

Thanks,

Alex

@agibsonccc I may have completely misunderstood your request. If I have, I am sorry.

You can run my model by running

Just download my entire repository and run the file.

Please keep in mind that after each execution you have to remove

src/main/resources/uci

Please let me know if you have any questions,

Thanks

@agibsonccc I don’t know if you had a chance to take a look at my latest message above. Please let me know whether I understood correctly what you were looking for.

Thanks

@adonnini oh thanks for assembling that. Let me give that a shot. Sorry, I’ve been heads down on something.

@agibsonccc I hope I am not being too pushy. Do you think you will have time to run my model during the next couple of weeks?
Thanks

@adonnini no you’re not! I’m just in the middle of finishing this PR: Fixes reshape relates segfaults, convert more tad pack related functions to use normal pointers to avoid java related deallocation isssues by agibsonccc · Pull Request #10206 · deeplearning4j/deeplearning4j · GitHub. As soon as I have a good enough branch for training runs, your model will be a great test case actually!

I should wrap this up sometime within the week; then you’re next on my list. I need to triple-check that this new >2GB flatbuffers format works.

@adonnini I attempted to look through your code, but I’m sorry, I’m not going to dismantle your whole repo to pull out what should just be a SameDiff configuration. I’ll just try to set up my own test case for this. All I see is a ton of commented-out code, what looks like a wrapper, and some code running. Literally all I need is:

SameDiff sd = SameDiff.create();
sd.(...) // something to set up your variables and configuration

Usually people put those in self-contained utility methods, if only for their own sanity. While I’d like to use your stuff here, I’m not going to put an undetermined amount of time into reverse engineering this.

The issue of “will it or won’t it load” should just be a “run a self-contained main method with the config, attempt to save the model, throw error”.

That should be all that’s involved here.
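For reference, a minimal self-contained reproducer of that shape might look like the sketch below. The shapes, variable names, and output path are illustrative placeholders, not details taken from the actual model; only the SameDiff/ND4J calls themselves are standard API.

import java.io.File;
import org.nd4j.autodiff.samediff.SDVariable;
import org.nd4j.autodiff.samediff.SameDiff;
import org.nd4j.linalg.api.buffer.DataType;
import org.nd4j.linalg.factory.Nd4j;

public class SaveRepro {
    public static void main(String[] args) throws Exception {
        SameDiff sd = SameDiff.create();
        // Set up a placeholder and a couple of variables; shapes are arbitrary stand-ins
        SDVariable in = sd.placeHolder("input", DataType.FLOAT, -1, 128);
        SDVariable w = sd.var("w", Nd4j.rand(DataType.FLOAT, 128, 64));
        SDVariable b = sd.var("b", Nd4j.zeros(DataType.FLOAT, 64));
        SDVariable out = in.mmul(w).add("out", b);
        // Attempt to save the graph; with a large enough model this is where the
        // flatbuffers size limit (and the resulting error) shows up
        sd.save(new File("model.fb"), true);
    }
}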

@adonnini I was able to reproduce the issue now. Let me see what’s going on with flatbuffers and I’ll see how to get something fixed up.

@adonnini so after some design work I think I can have something that’s doable in pure Java. It’ll end up being backwards compatible with the current loader as well, so it shouldn’t affect you.

I will need to write a new serializer that will work with external data. I’m playing with the right way to do that and narrowed it down to a few approaches. Give me some time next week to sort this out.

After looking at it, the new 64-bit support in flatbuffers never landed in the other languages, just C++. Since that would add yet another native dependency on top of everything else, I’m just going to opt for an easier-to-use approach. I’ll post more updates as I progress on this.
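For context on why that 2 GB boundary matters: 32-bit flatbuffers offsets cap a single buffer at roughly Integer.MAX_VALUE bytes, and a large model’s variable data alone can blow past that. A rough, illustrative estimate (ignoring graph and op metadata, and not part of any dl4j API) could be computed like this:

import org.nd4j.autodiff.samediff.SDVariable;
import org.nd4j.autodiff.samediff.SameDiff;
import org.nd4j.linalg.api.ndarray.INDArray;

public class FlatBufferSizeCheck {
    // Rough lower bound on the serialized size: sum of raw variable array bytes
    public static boolean exceedsSingleBufferLimit(SameDiff sd) {
        long totalBytes = 0;
        for (SDVariable v : sd.variables()) {
            INDArray arr = v.getArr();
            if (arr != null) {
                totalBytes += arr.length() * arr.dataType().width();
            }
        }
        return totalBytes > Integer.MAX_VALUE; // ~2 GB, the 32-bit offset ceiling
    }
}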

Thanks very much @agibsonccc. I am sorry it’s taking you quite a bit of time.

@adonnini give me another day on this. First test passes but I need to do a bit more refining:

This ended up being quite the change and will require separate classes, but it should be doable with M2.1 if you pull it into a separate project. You’ll need to put in the work to make this work, though. The main issue is the new flatbuffers classes I had to generate for this. That means you’ll need the serializer class and those Java classes. Theoretically there’s no reason it shouldn’t work with that.

@adonnini I"m not going to write out the “how to” for this today. I know you’ll need that to proceed. I’m done on the basics though. It’ll be a single file zip file with shards (basically parts of the model when it’s too big)

You’ll use a utility class in pure Java instead of the existing .save method. As I said, since this isn’t something you just call .save on, you’ll unfortunately be exposed to the details, but it should at least be usable with a bit of work on your end.
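For anyone unfamiliar with the layout being described, “a single zip file with shards” just means the serialized model is split into chunks and each chunk is stored as an entry in one archive. A plain-JDK sketch of that packaging idea (illustrative only, not the actual dl4j serializer) looks like:

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class ShardZipSketch {
    // Split the serialized model bytes into fixed-size shards and store each
    // shard as its own entry inside a single zip file
    public static void writeShards(byte[] modelBytes, File zipFile, int shardSize) throws IOException {
        try (ZipOutputStream zos = new ZipOutputStream(new FileOutputStream(zipFile))) {
            int shard = 0;
            for (int off = 0; off < modelBytes.length; off += shardSize, shard++) {
                zos.putNextEntry(new ZipEntry("shard-" + shard + ".bin"));
                zos.write(modelBytes, off, Math.min(shardSize, modelBytes.length - off));
                zos.closeEntry();
            }
        }
    }
}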

@agibsonccc Thank you so much!

@adonnini so how do you want to do this? I think as a next step I’ll touch up the snapshot builds and let you try it like that. It’s still very much WIP and there’s a lot of code I need to fix yet, but it should allow you to run your model.

You could also try taking the Java classes from the project and running the serializer standalone. That was what I was going to document for you, but I have a feeling it’s not going to work for you. Do you want to try?

@agibsonccc I would like to go with the snapshot builds. Let me know what I need to download, or what the updated dependencies are.
Thanks

@adonnini ok that’s what I figured. Give me a bit on that then.