NullPointerException org.nd4j.autodiff.samediff.internal.Variable.getOutputOfOp()

Hi,

I am finally back to working on my implementation of the "Attention Is All You Need" model and have made good progress (I think).

However, now I have run into a problem I am stuck on.

Below, you will find the error log.

Probably, you will need more than the error log. I have enabled both verbose and debug logging. I have plenty of logs. My code is messy but I would gladly share it with you.

Please let me know what you think may be going on.

Thanks,

Alex Donnini

ERROR LOG

Exception in thread "main" java.lang.RuntimeException: java.lang.NullPointerException: Cannot invoke "org.nd4j.autodiff.samediff.internal.Variable.getOutputOfOp()" because the return value of "org.apache.commons.collections4.trie.PatriciaTrie.get(Object)" is null
	at org.deeplearning4j.examples.trajectorypredictiontransformer.LocationNextNeuralNetworkV7_04.fitAndEvaluateTestDataset(LocationNextNeuralNetworkV7_04.java:1542)
	at org.deeplearning4j.examples.trajectorypredictiontransformer.LocationNextNeuralNetworkV7_04.sameDiff3(LocationNextNeuralNetworkV7_04.java:960)
	at org.deeplearning4j.examples.trajectorypredictiontransformer.LocationNextNeuralNetworkV7_04.main(LocationNextNeuralNetworkV7_04.java:221)
Caused by: java.lang.NullPointerException: Cannot invoke "org.nd4j.autodiff.samediff.internal.Variable.getOutputOfOp()" because the return value of "org.apache.commons.collections4.trie.PatriciaTrie.get(Object)" is null
	at org.nd4j.autodiff.samediff.serde.FlatBuffersMapper.asFlatNode(FlatBuffersMapper.java:889)
	at org.nd4j.autodiff.samediff.serde.FlatBuffersMapper.cloneViaSerialize(FlatBuffersMapper.java:1056)
	at org.nd4j.autodiff.samediff.SameDiff.invokeGraphOn(SameDiff.java:656)
	at org.nd4j.autodiff.samediff.SameDiff.lambda$createGradFunction$1(SameDiff.java:4770)
	at org.nd4j.autodiff.samediff.SameDiff.defineFunction(SameDiff.java:4557)
	at org.nd4j.autodiff.samediff.SameDiff.defineFunction(SameDiff.java:4542)
	at org.nd4j.autodiff.samediff.SameDiff.createGradFunction(SameDiff.java:4762)
	at org.nd4j.autodiff.samediff.SameDiff.createGradFunction(SameDiff.java:4669)
	at org.nd4j.autodiff.samediff.SameDiff.fitHelper(SameDiff.java:1870)
	at org.nd4j.autodiff.samediff.SameDiff.fit(SameDiff.java:1792)
	at org.nd4j.autodiff.samediff.SameDiff.fit(SameDiff.java:1660)
	at org.deeplearning4j.examples.trajectorypredictiontransformer.LocationNextNeuralNetworkV7_04.fitAndEvaluateTestDataset(LocationNextNeuralNetworkV7_04.java:1539)
	... 2 more

@adonnini this looks like it’s due to a variable being missing. Could you print samediff.summary()? It feels like a variable name might be undefined somewhere.

Thanks @agibsonccc When you say that a variable is probably undefined, do you mean one that is listed in sd.summary as having <none> in the Output of Function column?

I do have sd.summary printed out. It’s very big. I searched it for “undefined” and found nothing.

Should I be looking for records in sd.summary with specific keywords/values?

Thanks,

Alex

@adonnini Sorry, please ping me if you don’t hear from me within a day or so.

Yes, sd.summary(). The graph works in terms of inputs and outputs. I need to see what op it is and what its inputs and outputs are.
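For example, right before the sd.fit(...) call that fails (assuming your SameDiff instance is called sd):

    // Print the full graph structure: every variable, the op that produces it,
    // and each op's inputs and outputs.
    System.out.println(sd.summary());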

Just paste it as a github gist and I’ll look at it.

@agibsonccc Thanks for helping me out.

Here is the link to the gist

Probably, you will be horrified when you see it.

Let me know what I should do next (other than starting from scratch, I hope).

Thanks,

Alex

Looking at this it’s kinda hard to tell… this will help when I get more info, though. Can you put a debugger on where it errors out?

I’m not sure what you’re using for development, but either one of those, if they’re maven integrated, should be able to download the sources for the dependency.

In the stack frame information there should be what variable it was stuck on.

I’m still convinced something is just defined wrong.

@agibsonccc Thanks. I’ll put the debugger at the point of failure and let you know what I find.

I use IntelliJ IDEA.

I am a bit confused by this.

I am not sure what you mean by “either one of those”. Which sources are you referring to?

Thanks,

Alex

@agibsonccc please take a look below. Is this the information you were looking for? I did not expand all items in the “Threads and Variables” panel of the debugger in IDEA. Please let me know.
Thanks

Exception = {NullPointerException@5194} 
 extendedMessageState = 1
 extendedMessage = null
 backtrace = {Object[7]@7095} 
  0 = {short[32]@7101} [3, 4, 149, 297, 224, 226, 231, 232, 176, 88, 89, 7, 4, 2, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
  1 = {int[32]@7102} [43319296, 3670016, 22413312, 2162688, 1507328, 655360, 26673152, 327680, 34340864, 6750208, 3080192, 229310464, 273481728, 7929856, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
  2 = {Object[32]@7103} 
  3 = {long[32]@7104} [139683091420800, 139683091469632, 139683090522656, 139683091440224, 139683091414656, 139683091414656, 139683091417312, 139683091417312, 139683091386848, 139683076259456, 139683076259456, 139683076262208, 139683076247264, 139680914365384, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
 detailMessage = null
 cause = {NullPointerException@5194} "java.lang.NullPointerException: Cannot invoke "org.nd4j.autodiff.samediff.internal.Variable.getOutputOfOp()" because the return value of "org.apache.commons.collections4.trie.PatriciaTrie.get(Object)" is null"
  extendedMessageState = 2
  extendedMessage = "Cannot invoke "org.nd4j.autodiff.samediff.internal.Variable.getOutputOfOp()" because the return value of "org.apache.commons.collections4.trie.PatriciaTrie.get(Object)" is null"
   value = {byte[176]@7129} [67, 97, 110, 110, 111, 116, 32, 105, 110, 118, 111, 107, 101, 32, 34, 111, 114, 103, 46, 110, 100, 52, 106, 46, 97, 117, 116, 111, 100, 105, 102, 102, 46, 115, 97, 109, 101, 100, 105, 102, 102, 46, 105, 110, 116, 101, 114, 110, 97, 108, 46, 86, 97, 114, 105, 97, 98, 108, 101, 46, 103, 101, 116, 79, 117, 116, 112, 117, 116, 79, 102, 79, 112, 40, 41, 34, 32, 98, 101, 99, 97, 117, 115, 101, 32, 116, 104, 101, 32, 114, 101, 116, 117, 114, 110, 32, 118, 97, 108, 117, +76 more]
   coder = 0
   hash = 0
   hashIsZero = false
  backtrace = {Object[7]@7095} 
   0 = {short[32]@7101} [3, 4, 149, 297, 224, 226, 231, 232, 176, 88, 89, 7, 4, 2, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
   1 = {int[32]@7102} [43319296, 3670016, 22413312, 2162688, 1507328, 655360, 26673152, 327680, 34340864, 6750208, 3080192, 229310464, 273481728, 7929856, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
   2 = {Object[32]@7103} 
    0 = {Class@3954} "class org.nd4j.autodiff.samediff.serde.FlatBuffersMapper"
    1 = {Class@3954} "class org.nd4j.autodiff.samediff.serde.FlatBuffersMapper"
    2 = {Class@925} "class org.nd4j.autodiff.samediff.SameDiff"
    3 = {Class@925} "class org.nd4j.autodiff.samediff.SameDiff"
    4 = {Class@925} "class org.nd4j.autodiff.samediff.SameDiff"
    5 = {Class@925} "class org.nd4j.autodiff.samediff.SameDiff"
    6 = {Class@925} "class org.nd4j.autodiff.samediff.SameDiff"
    7 = {Class@925} "class org.nd4j.autodiff.samediff.SameDiff"
    8 = {Class@925} "class org.nd4j.autodiff.samediff.SameDiff"
    9 = {Class@925} "class org.nd4j.autodiff.samediff.SameDiff"
    10 = {Class@925} "class org.nd4j.autodiff.samediff.SameDiff"
    11 = {Class@869} "class org.deeplearning4j.examples.trajectorypredictiontransformer.LocationNextNeuralNetworkV7_04"
    12 = {Class@869} "class org.deeplearning4j.examples.trajectorypredictiontransformer.LocationNextNeuralNetworkV7_04"
    13 = {Class@869} "class org.deeplearning4j.examples.trajectorypredictiontransformer.LocationNextNeuralNetworkV7_04"
   3 = {long[32]@7104} [139683091420800, 139683091469632, 139683090522656, 139683091440224, 139683091414656, 139683091414656, 139683091417312, 139683091417312, 139683091386848, 139683076259456, 139683076259456, 139683076262208, 139683076247264, 139680914365384, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
  detailMessage = null
  cause = {NullPointerException@5194} "java.lang.NullPointerException: Cannot invoke "org.nd4j.autodiff.samediff.internal.Variable.getOutputOfOp()" because the return value of "org.apache.commons.collections4.trie.PatriciaTrie.get(Object)" is null"
   extendedMessageState = 2
   extendedMessage = "Cannot invoke "org.nd4j.autodiff.samediff.internal.Variable.getOutputOfOp()" because the return value of "org.apache.commons.collections4.trie.PatriciaTrie.get(Object)" is null"
   backtrace = {Object[7]@7095} 
   detailMessage = null
   cause = {NullPointerException@5194} "java.lang.NullPointerException: Cannot invoke "org.nd4j.autodiff.samediff.internal.Variable.getOutputOfOp()" because the return value of "org.apache.commons.collections4.trie.PatriciaTrie.get(Object)" is null"
   stackTrace = {StackTraceElement[14]@7099} 
   depth = 14
   suppressedExceptions = {Collections$EmptyList@7097}  size = 0
  stackTrace = {StackTraceElement[14]@7099} 
  depth = 14
  suppressedExceptions = {Collections$EmptyList@7097}  size = 0
 stackTrace = {StackTraceElement[14]@7099} 
  0 = {StackTraceElement@7105} "org.nd4j.autodiff.samediff.serde.FlatBuffersMapper.asFlatNode(FlatBuffersMapper.java:889)"
  1 = {StackTraceElement@7106} "org.nd4j.autodiff.samediff.serde.FlatBuffersMapper.cloneViaSerialize(FlatBuffersMapper.java:1056)"
  2 = {StackTraceElement@7107} "org.nd4j.autodiff.samediff.SameDiff.invokeGraphOn(SameDiff.java:656)"
  3 = {StackTraceElement@7108} "org.nd4j.autodiff.samediff.SameDiff.lambda$createGradFunction$1(SameDiff.java:4770)"
  4 = {StackTraceElement@7109} "org.nd4j.autodiff.samediff.SameDiff.defineFunction(SameDiff.java:4557)"
  5 = {StackTraceElement@7110} "org.nd4j.autodiff.samediff.SameDiff.defineFunction(SameDiff.java:4542)"
  6 = {StackTraceElement@7111} "org.nd4j.autodiff.samediff.SameDiff.createGradFunction(SameDiff.java:4762)"
  7 = {StackTraceElement@7112} "org.nd4j.autodiff.samediff.SameDiff.createGradFunction(SameDiff.java:4669)"
  8 = {StackTraceElement@7113} "org.nd4j.autodiff.samediff.SameDiff.fitHelper(SameDiff.java:1870)"
  9 = {StackTraceElement@7114} "org.nd4j.autodiff.samediff.SameDiff.fit(SameDiff.java:1792)"
  10 = {StackTraceElement@7115} "org.nd4j.autodiff.samediff.SameDiff.fit(SameDiff.java:1660)"
  11 = {StackTraceElement@7116} "org.deeplearning4j.examples.trajectorypredictiontransformer.LocationNextNeuralNetworkV7_04.fitAndEvaluateTestDataset(LocationNextNeuralNetworkV7_04.java:1539)"
  12 = {StackTraceElement@7117} "org.deeplearning4j.examples.trajectorypredictiontransformer.LocationNextNeuralNetworkV7_04.sameDiff3(LocationNextNeuralNetworkV7_04.java:960)"
  13 = {StackTraceElement@7118} "org.deeplearning4j.examples.trajectorypredictiontransformer.LocationNextNeuralNetworkV7_04.main(LocationNextNeuralNetworkV7_04.java:221)"
 depth = 14
 suppressedExceptions = {Collections$EmptyList@7097}  size = 0
sameDiff = {SameDiff@5196} "SameDiff(nVars=856,nOps=566)"
 variables = {PatriciaTrie@7019}  size = 856
 ops = {LinkedHashMap@7020}  size = 566
 sessions = {ConcurrentHashMap@7021}  size = 1
 sequences = {ConcurrentHashMap@7022}  size = 0
 constantArrays = {ThreadSafeArrayHolder@7023} 
 variablesArrays = {ThreadSafeArrayHolder@7024} 
 eagerArrays = {ThreadSafeArrayHolder@7025} 
 otherPlaceHoldersPerThread = {ConcurrentHashMap@7026}  size = 0
 placeholdersPerThread = {ConcurrentHashMap@7027}  size = 0
 lossVariables = {ArrayList@7028}  size = 1
 listeners = {ArrayList@7029}  size = 0
 nameScopes = {ArrayList@7030}  size = 5
 outputs = null
 eagerMode = false
 enableCache = true
 trainingConfig = {TrainingConfig@7031} "TrainingConfig(updater=Nesterovs(learningRate=0.01, learningRateSchedule=null, momentum=0.9, momentumISchedule=null, momentumSchedule=null), regularization=[], minimize=true, dataSetFeatureMapping=[input], dataSetLabelMapping=[label], dataSetFeatureMaskMapping=null, dataSetLabelMaskMapping=null, lossVariables=[createAndConfigureModel - 33256075/Training - 44032318/Training - 64139635/createAndConfigureModel - 5379418/lossMSE], iterationCount=0, epochCount=0, initialLossDataType=FLOAT, trainEvaluations={outReduced - 2445088=[Evaluation: No data available (no evaluation has been performed)]}, trainEvaluationLabels={outReduced - 2445088=0}, validationEvaluations={}, validationEvaluationLabels={})"
 initializedTraining = true
 updaterMap = {HashMap@7032}  size = 141
 variableId = 0
 math = {SDMath@7033} 
 random = {SDRandom@7034} 
 nn = {SDNN@7035} 
 cnn = {SDCNN@7036} 
 rnn = {SDRNN@7037} 
 loss = {SDLoss@7038} 
 image = {SDImage@7039} 
 bitwise = {SDBitwise@7040} 
 linalg = {SDLinalg@7041} 
 sameDiffFunctionInstances = {LinkedHashMap@7042}  size = 0
 wasRegistered = {AtomicBoolean@7043} "false"
 debugMode = false
 argumentInterceptors = {Stack@7044}  size = 0
 pausedArgumentInterceptors = {HashSet@7045}  size = 0
 blockNames = {HashSet@7046}  size = 0
 logExecution = true
 parent = null
 child = null
 sd = {SameDiff@5196} "SameDiff(nVars=856,nOps=566)"
node = {MeanSquaredErrorLoss@5197} "mean_sqerr_loss"
 lossReduce = {LossReduce@7007} "MEAN_BY_NONZERO_WEIGHT_COUNT"
 opName = null
 inputArguments = {ArrayList@7008}  size = 0
 outputArguments = {ArrayList@7009}  size = 0
 tArguments = {ArrayList@7010}  size = 0
 iArguments = {ArrayList@7011}  size = 1
 bArguments = {ArrayList@7012}  size = 0
 dArguments = {ArrayList@7013}  size = 0
 sArguments = {ArrayList@7014}  size = 0
 axis = {ArrayList@7015}  size = 0
 inplaceCall = false
 hash = 0
 outputVariables = {SDVariable[1]@5211} 
  0 = {SDVariable@6890} "SDVariable(name="createAndConfigureModel - 33256075/Training - 44032318/Training - 64139635/createAndConfigureModel - 5379418/lossMSE",variableType=ARRAY,dtype=FLOAT)"
 outputShapes = null
 sameDiff = {SameDiff@5196} "SameDiff(nVars=856,nOps=566)"
 inPlace = false
 scalarValue = null
 dimensions = null
 extraArgs = null
 ownName = "createAndConfigureModel - 33256075/Training - 44032318/Training - 64139635/createAndConfigureModel - 5379418/mean_sqerr_loss"
 ownNameSetWithDefault = true
bufferBuilder = {FlatBufferBuilder@5198} 
 bb = {HeapByteBuffer@7002} "java.nio.HeapByteBuffer[pos=0 lim=1024 cap=1024]"
 space = 1024
 minalign = 1
 vtable = null
 vtable_in_use = 0
 nested = false
 finished = false
 object_start = 0
 vtables = {int[16]@7003} [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
 num_vtables = 0
 vector_num_elems = 0
 force_defaults = false
 bb_factory = {FlatBufferBuilder$HeapByteBufferFactory@7004} 
 utf8 = {Utf8Safe@7005} 
variables = {ArrayList@5199}  size = 856
reverseMap = {HashMap@5200}  size = 856
forwardMap = {HashMap@5201}  size = 0
framesMap = {HashMap@5202}  size = 0
idCounter = {AtomicInteger@5203} "0"
id = {Integer@5204} 0
opName = "mean_sqerr_loss"
 value = {byte[15]@6892} [109, 101, 97, 110, 95, 115, 113, 101, 114, 114, 95, 108, 111, 115, 115]
 coder = 0
 hash = -817134891
 hashIsZero = false
hash = 5986466946445196344
extras = {double[0]@5206} []
boolArgs = {boolean[0]@5207} []
dtypeArgs = null
extraBits = {long[1]@5208} [3]
extraStringIds = null
sArgs = null
inPaired = {ArrayList@5209}  size = 0
outputIds = {int[1]@5210} [117]
 0 = 117
outputVertexId = {SDVariable[1]@5211} 
 0 = {SDVariable@6890} "SDVariable(name="createAndConfigureModel - 33256075/Training - 44032318/Training - 64139635/createAndConfigureModel - 5379418/lossMSE",variableType=ARRAY,dtype=FLOAT)"
inputs = {SDVariable[3]@5212} 
 0 = {SDVariable@5213} "SDVariable(name="",variableType=VARIABLE,dtype=FLOAT)"
 1 = {SDVariable@6886} "SDVariable(name="createAndConfigureModel - 33256075/Training - 44032318/Training - 64139635/createAndConfigureModel - 5379418/sd_var_74",variableType=CONSTANT,dtype=FLOAT)"
 2 = {SDVariable@6887} "SDVariable(name="label",variableType=PLACEHOLDER,dtype=FLOAT,shape=[-1, 2, -1])"
input = {SDVariable@5213} "SDVariable(name="",variableType=VARIABLE,dtype=FLOAT)"
 sameDiff = {SameDiff@5196} "SameDiff(nVars=856,nOps=566)"
 varName = ""
 variableType = {VariableType@6881} "VARIABLE"
 shape = {long[3]@6882} [128, 2, 69]
 dataType = {DataType@6883} "FLOAT"
 creator = null
varName = ""
 value = {byte[0]@6880} []
 coder = 0
 hash = 0
 hashIsZero = true

@adonnini looking at this in depth, try to use intellij to put a conditional breakpoint where the exception happens.
You should be able to show me what the actual variable state is.

I still think it’s a disconnected node in your graph.
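As a quick sanity check before the fit call, a loop like this could surface any variable that never got a proper name (just an illustrative snippet, assuming sd is your SameDiff instance — it’s not a built-in ND4J utility; note your debugger dump above already shows one variable with varName = ""):

    // Flag variables whose names are empty; an unnamed or disconnected
    // variable is the usual suspect for this kind of lookup failure.
    for (SDVariable v : sd.variables()) {
        if (v.name() == null || v.name().isEmpty()) {
            System.out.println("Suspicious variable: " + v);
        }
    }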

Can you maybe just post the samediff code? I’m not sure how complex it is but it’s still hard to understand what’s going on with just the input here. I was hoping to see what variables are present there.

@agibsonccc I created a public repository containing the code and the other necessary folders and files. You can access it at

I include instructions for running the code in README.MD

Please let me know if you encounter any problems.

By the way, you’ll see that the code is (over)full of log statements and you’ll probably find it pretty messy. If you think it’s just too messy to make any sense of it, I completely understand. No problem at all.
I really appreciate your offer to help me out.
Thanks,
Alex

@adonnini sorry it took me a bit to download this and look at it. My hesitance in looking at the repo is that your code is never runnable out of the box… it’s just hard to dissect. I don’t mind helping here, but when I have to wade through half a desert to do so it’s hard to make the time.
It’s one thing to take a quick look it’s another to take 30 minutes to an hour to reverse engineer and understand what I can about your whole code base.

Sorry, just know that in the future it’d be much easier for me to make quick judgement calls if things were cleaner.

First things first I am NOT going to try to clean all of this up.

If I cannot just import this into intellij as a standard maven/gradle project out of the box, with zero configuration on my part, I am not going to spend time on this beyond doing what I’m about to do here.
Let me give you some blanket recommendations:

  1. Don’t create SDVariables with new. Use the samediff methods (including sd.var, sd.constant, etc.) to create your variables.
  2. Try to use samediff declaratively, like in the examples. Set it up like dl4j, where you declare the layers. Samediff is more flexible, but the same idea applies here.

A broader idea on how samediff works: it works declaratively. That means that as you use an instance of a samediff model and add things to it, it adds to an underlying graph structure.
You create variables/constants and declare ops like inputs/outputs. Those become 1 or more sd variables internally. Those are all stored in a dictionary (which is what’s failing here). The failure is somewhere in the graph structure: an output of an op isn’t being associated with something else in the graph, which is where your code is failing.

When creating variables, pass in names to your variables. All samediff declarations allow that. Otherwise they’re just given UUIDs.
Try to pass in strings to each call. If you can refactor this, I might be able to troubleshoot it with just the summary.
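For example, here’s a minimal sketch of that declarative, named style (the names, shapes, and loss setup are made up for illustration, not taken from your model):

    import org.nd4j.autodiff.loss.LossReduce;
    import org.nd4j.autodiff.samediff.SDVariable;
    import org.nd4j.autodiff.samediff.SameDiff;
    import org.nd4j.linalg.api.buffer.DataType;
    import org.nd4j.linalg.factory.Nd4j;

    public class NamedGraphSketch {
        public static void main(String[] args) {
            SameDiff sd = SameDiff.create();

            // Create every placeholder/variable/constant through the SameDiff
            // instance so it is registered in the internal graph, and give each
            // one an explicit name instead of letting it get a generated UUID.
            SDVariable input = sd.placeHolder("input", DataType.FLOAT, -1, 10);
            SDVariable label = sd.placeHolder("label", DataType.FLOAT, -1, 2);
            SDVariable w = sd.var("w", Nd4j.rand(DataType.FLOAT, 10, 2));
            SDVariable b = sd.constant("b", Nd4j.zeros(DataType.FLOAT, 1, 2));

            // Op outputs take a name too; unnamed outputs fall back to UUIDs.
            SDVariable out = input.mmul(w).add("out", b);
            SDVariable weights = sd.constant("mse_weights", Nd4j.scalar(1.0f));
            sd.loss.meanSquaredError("loss", label, out, weights,
                    LossReduce.MEAN_BY_NONZERO_WEIGHT_COUNT);

            // With names everywhere, the summary becomes readable.
            System.out.println(sd.summary());
        }
    }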

If I could figure out how to run this, or get an idea of the graph structure, I would be able to help you a bit. Unfortunately, when I originally asked you for the summary I anticipated some sort of structure, but it’s just hard to decipher. Usually bigger graph sizes don’t matter if there are at least variable names. That would have allowed me to point you in the right direction.

@agibsonccc Thanks for the very helpful response.

I have been trying to give all sd variables a name, as you suggested. I will check all the code to make sure that every sd variable has a name.

Below, you will find step-by-step instructions for cloning the repository containing the code and for running the code. If this is not enough, please let me know and I will proceed on my own.

Please note that I probably made these instructions too detailed, especially for someone as experienced as you are. I just wanted to make sure I was as clear as possible. Please let me know if I made any mistakes or if something is not clear.

Again, I really appreciate your willingness to help me.

Thanks,

Alex

INSTRUCTIONS FOR CLONING THE TrajectoryPredictionTransformer REPOSITORY AND RUNNING THE TRAJECTORY PREDICTION TRANSFORMER MODEL
=======================================================================================================================
1) Go to 
https://github.com/adonnini/trajectory-prediction-transformersContextQ/tree/master

2) Click on "Code" button

3) Click on "Copy URL to clipboard" icon

4) On your system, open a terminal session

5) Go to the directory where you want the cloned repository to be located

6) Run
git clone -b master --single-branch https://github.com/adonnini/trajectory-prediction-transformersContextQ.git

7) Open IntelliJ IDEA

8) On the menu bar click on File >>> Open

9) Navigate to the location of the cloned repository

10) Expand trajectory-prediction-transformersContextQ

11) Expand deeplearning4j-examples-master_1

12) Open TrajectoryPredictionTransformer

13) In the TrajectoryPredictionTransformer project opened in IntelliJ IDEA, expand src/main/java/trajectorypredictiontransformer

14) Open LocationNextNeuralNetworkV7_04.java

15) Compile LocationNextNeuralNetworkV7_04.java (optional)

16) Run LocationNextNeuralNetworkV7_04.java

IMPORTANT NOTE: AFTER EVERY EXECUTION OF LocationNextNeuralNetworkV7_04.java YOU NEED TO DELETE THE FOLDER uci IN
TrajectoryPredictionTransformer/src/main/resources/
I am sorry about this. I know it's a nuisance and could be "easily" fixed by deleting the folder from the code after training completes. I just have not gotten around to it.
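For what it's worth, the cleanup could be as small as a method like this, dropped into the class and called after training (a sketch only; the path is taken from the note above, so adjust it to where the code actually writes the folder):

    import java.io.IOException;
    import java.nio.file.*;
    import java.util.Comparator;

    // Recursively delete the uci folder after a run completes so the next
    // run starts clean (the fix the note above describes).
    static void deleteUciFolder() throws IOException {
        Path uci = Paths.get("TrajectoryPredictionTransformer/src/main/resources/uci");
        if (!Files.exists(uci)) return;
        try (var paths = Files.walk(uci)) {
            paths.sorted(Comparator.reverseOrder()) // delete children before parents
                 .forEach(p -> p.toFile().delete());
        }
    }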

@adonnini no it’s not about your instructions.

I don’t see a build.gradle, a standard maven directory structure, or anything like that in your project. The root of your repo is literally missing anything resembling a build tool. I’m guessing this was stripped out of a larger project or something?

The most I’d want to do when running a 1 off project is “import from this folder”. That should be very doable if all you do is put the maven configuration in there (some combination of nd4j-native-platform and deeplearning4j-nn) and a minimal reproducer.
I don’t think I’m asking for very much here. Otherwise, as I mentioned, I’ll continue to suggest what I can if you can point to how you’re building your model.
Organize it better first if it’s too much work to set up that kind of directory structure.

@agibsonccc if you follow my instructions you will be able to run the code.
I am very familiar with gradle. In this case, you do not need anything other than what is in the repository. The entire project builds successfully, and LocationNextNeuralNetworkV7_04 also compiles successfully.
There is a pom.xml in TrajectoryPredictionTransformer.

Before sending you the instructions, I tried them and they work, i.e. I was able to run the code, specifically LocationNextNeuralNetworkV7_04.

If you do not feel comfortable proceeding, or you feel it would be too time consuming, I understand. Thanks for what you have done so far.
Thanks,
Alex

@adonnini please make it conform to whatever you’re comfortable with. I’m not questioning whether you know the tool or not. I’m assuming you do.

You clearly have a local configuration that works for you. Standard procedure for sharing a project, instead of 1 off instructions, is still not too much to ask, I think.

All I’m asking for is a basic build.gradle or pom.xml for your project with the needed dependencies, one that I can import in 1 click with intellij. Try to meet me halfway with the reproducer.

Following a non-standard set of instructions is a barrier I’m just not willing to cross, time-wise. You know as well as anyone I have limited bandwidth. I don’t think it’s too much to ask here.

Putting up barriers to me helping you isn’t really helping either of us. I don’t think it’s too much to ask and it’s a standard any user I’ve asked for this from has been able to follow. I wouldn’t ask for it if it wasn’t industry standard well supported tech.

@agibsonccc The project does have a pom.xml. Otherwise, it could not be built and the code would not run. Although my instructions have several steps, all in all they should not take more than 5-10 minutes to follow to the end.
Unfortunately, I think we may be at an impasse. Doing what you are asking may be beyond my skills. Thanks anyway for your generous offer to help.
Thanks,
Alex

@adonnini just go ahead and clean up your sd variables and I’ll take a look at your declarations then. Try to follow my clean up advice to make it easier to read your summary and I’ll try to decipher your graph structure from that.

Edit: Let me at least give you something to consider.
At most you probably need 2 pom declarations:
nd4j-native-platform 1.0.0-M2.1
deeplearning4j-nn 1.0.0-M2.1.

There might be 2 deps you need to make this work with 1 import.

All you have to do is create a new maven project in intellij, put those 2 dependencies in there and put the subset of your code you need to look at the structure of in there.
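In pom.xml terms that’s roughly the following (the group IDs are the standard ND4J/DL4J ones; double-check the coordinates against Maven Central):

    <dependencies>
        <dependency>
            <groupId>org.nd4j</groupId>
            <artifactId>nd4j-native-platform</artifactId>
            <version>1.0.0-M2.1</version>
        </dependency>
        <dependency>
            <groupId>org.deeplearning4j</groupId>
            <artifactId>deeplearning4j-nn</artifactId>
            <version>1.0.0-M2.1</version>
        </dependency>
    </dependencies>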

That would allow any java developer who works with those tools (including me and 90% of the industry) to import your project without fuss.

It’s your discretion whether to try that or not, but that’s the standard bar I set for anyone posting here (or on our old mailing lists from 10 years ago, for that matter).

One thing I’ve unfortunately had to do to even keep providing support is to set standard boundaries for reproducing issues. Debugging end user code + their issue + potentially a bug in dl4j itself (if there is one) at scale (read: > 1 person over time) just burns you out. Sorry if the boundary for this specific circumstance seems arbitrary, but it’s really just a hard boundary I’ve set for helping folks.

@adonnini just following up here, I wanted to make sure I wasn’t missing something:

Look at your repo again: there isn’t a pom in there. You clearly have a maven directory structure here. Why can’t you just put the file there so that I can look at it? I feel like we’re fairly close here. If all you need to do is strip the code down a bit and add your pom, could you clarify the issue?

@agibsonccc in the instructions, the URL for the repo is

I think you were looking at the first repo, whose link I sent you a few days ago.

@adonnini oh, this looks much better! Yeah, let me take a look. Sorry, not sure why the forum didn’t ping me about this. Please do send me an @; I usually rely on notifications to reply to questions. Thanks!