ONNX import fails

I’m following this guide
problem 1)
broken link under ‘For implementing your own custom overrides please see here’

problem 2)

I am trying to import a HuggingFace BigBird ONNX model. When calling:
SameDiff graph = onnxFrameworkImporter.runImport(file.getAbsolutePath(), Collections.emptyMap());

I get error:

Exception in thread "main" java.lang.IllegalArgumentException: No import process defined for Unsqueeze
	at org.nd4j.samediff.frameworkimport.registry.OpMappingRegistry.lookupOpMappingProcess(OpMappingRegistry.kt:129)

I have downloaded these files and put them under src/main/resources.

I’m using JDK 1.8 and DL4J 1.0.0-M1.1.

my save command was:

torch.onnx.export(model,                    # model being run
                  torch.tensor(train_input).to(device),  # model input (or a tuple for multiple inputs)
                  dataset + "_hf_bigbird.onnx",  # where to save the model (can be a file or file-like object)
                  export_params=True,       # store the trained parameter weights inside the model file
                  opset_version=11,         # the ONNX version to export the model to
                  do_constant_folding=True, # whether to execute constant folding for optimization
                  input_names=['input'],    # the model's input names
                  output_names=['output'])  # the model's output names

@craig88 Unsqueeze is missing. I’ll actually be adding it.

For now you can see what’s implemented here:

I’ve updated the documentation page with an explanation of how to use it:

Please let me know if you want help with mapping Unsqueeze.

Much of the libnd4j framework was implemented for interoperating with TensorFlow, so you can generally follow the onnx-tensorflow backend, like here: onnx-tensorflow/unsqueeze.py at af8b6a9825c288a80b8dd1b6edd1ea0cecd1556e · onnx/onnx-tensorflow · GitHub

Samediff also has the same operations available for you.
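For anyone mapping it: ONNX’s Unsqueeze simply inserts size-1 dimensions at the given axes (in opset 11 the axes come in as an attribute, and they refer to positions in the output shape). A minimal pure-Python sketch of the shape rule; `unsqueeze_shape` is a hypothetical helper written just for illustration, not part of any library:

```python
def unsqueeze_shape(shape, axes):
    """Output shape of ONNX Unsqueeze: insert a size-1 dimension
    at each requested axis (axes index into the *output* shape)."""
    out = list(shape)
    for axis in sorted(axes):  # insert ascending so indices stay valid
        out.insert(axis, 1)
    return tuple(out)

# (2, 3) with axes [0] -> (1, 2, 3); with axes [1, 3] -> (2, 1, 3, 1)
print(unsqueeze_shape((2, 3), [0]))
print(unsqueeze_shape((2, 3), [1, 3]))
```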

@craig88 I added Unsqueeze; it can be found here:

If you’d like, you can pull my branch and test it. I am adding quite a few ops in this pull request, but Unsqueeze should be there for you.

Or, if you DM me your model, I can try to convert it for you.

Is this via the POM? It doesn’t seem to be working:


Here is the ONNX model; it’s just a smoke test.

@craig88 No… where did you get that, so I can fix it? It should just be 1.0.0-SNAPSHOT or the latest version, 1.0.0-M1.1.

Sorry, I’m new to Maven; I had the naive notion I could use your GitHub branches/tags in the POM dependency. Anyway, I’ll wait till the updates are officially released; I can use onnxruntime until then.

@craig88 could you clarify? We have the nd4j-onnxruntime bindings that will allow an easy transition. I’m not sure if you saw those:

For your maven setup, I would seriously consider reading a quickstart to make sure you understand the basics. It won’t take long and will save you lots of headaches and unnecessary detours in the future.

You can only pull Maven dependencies from a Maven repository (Maven Central by default). You can’t just pull random git commits. Those commits and versions have to be manually built and deployed by the library’s maintainers to Maven Central or another Maven repository.

Beyond that, yes, you can once this is merged to master.
To embed the Unsqueeze implementation I wrote, all you have to do is declare the annotations as listed in the docs there and put it in your project.

That won’t be necessary after this is merged to master since it will be available by default.

There are other ops (sometimes custom to PyTorch or other frameworks, or even one-off ops implemented in research papers) where you need to do the same thing and declare them in your project in order for it to work. That lets you add a shim to the import without needing to change your dependencies.

Or again, if you want, you can just DM me your model and I’ll convert it for you.
After that you won’t even need the model import framework: the model will be converted to our flatbuffers format, which you can then load with Samediff.load(…)

If you want to use the new changes after they’re merged to master, that’s what snapshots are for. You just specify 1.0.0-SNAPSHOT with the snapshots repository. You can find that here:
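As a sketch of what that POM setup usually looks like (the repository URL is the OSS Sonatype snapshots repository the project used at the time, and the artifact below is only an example; pick the modules you actually need and check the linked docs):

```xml
<repositories>
  <repository>
    <id>sonatype-snapshots</id>
    <url>https://oss.sonatype.org/content/repositories/snapshots</url>
    <releases><enabled>false</enabled></releases>
    <snapshots><enabled>true</enabled></snapshots>
  </repository>
</repositories>

<!-- example dependency pinned to the snapshot version -->
<dependency>
  <groupId>org.nd4j</groupId>
  <artifactId>nd4j-native-platform</artifactId>
  <version>1.0.0-SNAPSHOT</version>
</dependency>
```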

I put in the Unsqueeze op and the op defs, but it doesn’t like the function signature:

override fun preProcess(
    op: SameDiffOp,
    sd: SameDiff,
    attributes: Map<String, Any>,
    descriptor: OpNamespace.OpDescriptor,
    outputNames: List<String>,
    isFinalOutput: Boolean
): HookResult {

it likes this signature:

override fun preProcess(
    op: SameDiffOp,
    sd: SameDiff,
    attributes: Map<String, Any>,
    descriptor: OpNamespace.OpDescriptor
): HookResult {

I’m using: 1.0-SNAPSHOT

@craig88 We just merged some changes to master that update the signature; the one you have is the old one. For Unsqueeze it should work fine.

@craig88 FYI, snapshots should have this now if you want to try it.