@SidneyLann I think we already had this discussion on GitHub. We need an einsum op, or you need to map the equivalent calls that einsum would decompose to. @quickwritereader mentioned that to you before, I believe.
Those ops would then have a backprop. tf.nest.map_structure would be similar.
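For the concrete equation in question (`...ik,ki->...i`), here is a minimal sketch of what that decomposition could look like in SameDiff. The shapes are assumptions for illustration only, and it relies on elementwise `mul` broadcasting the rank-2 kernel against the rank-3 input; if broadcasting isn't available there in your version, tile or reshape the kernel first:

```java
import org.nd4j.autodiff.samediff.SDVariable;
import org.nd4j.autodiff.samediff.SameDiff;
import org.nd4j.linalg.api.buffer.DataType;

public class EinsumDecomposition {
    public static void main(String[] args) {
        SameDiff sd = SameDiff.create();

        // Assumed shapes for illustration: x is [batch, I, K], kernel w is [K, I]
        SDVariable x = sd.placeHolder("x", DataType.FLOAT, -1, 4, 8);
        SDVariable w = sd.var("w", DataType.FLOAT, 8, 4);

        // "...ik,ki->...i" means: out[b, i] = sum_k x[b, i, k] * w[k, i]
        SDVariable wT = sd.transpose(w);   // [I, K]: lines w up with the last two dims of x
        SDVariable prod = x.mul(wT);       // broadcast elementwise multiply -> [batch, I, K]
        SDVariable out = prod.sum(2);      // reduce over k -> [batch, I]

        // transpose/mul/sum all have backprop defined, so gradients flow as usual
    }
}
```

The optional bias that `EinsumDense` supports would just be one more broadcast add on `out`.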
I would recommend a pre-process-hook-based implementation for that if you want to try it (something like this, but for TF; both TF and ONNX share the same core logic, so this will work for TF as well):
Before I elaborate on that, could you clarify whether you'd be OK with trying that?
Indeed, I have no Python skills, and I don't know what the two functions do. Would you please give pseudocode in SameDiff showing what the two functions do?
@SidneyLann If you don't have the time to write the code for this, then I don't know what to tell you. I have internal products/use cases and customers, plus the day-to-day to run. I can help support you in this, but beyond implementing features already slated for the roadmap, it's kind of hard to dive into all of this in depth. Generally that would be a few hours.
If you can meet me in the middle a bit (e.g., help me help you), then there's probably something we can do here.
That would mean you diving into TF yourself to figure out the right calls to make. Most of the functions are there, but when you ask for "pseudocode" for it, you're actually asking us to just do it.
You're not really asking for help here; you're more asking for someone to implement the whole library for you. There's not much to do but treat this as a feature request, to which I'll say the same as before: it's not really a priority and would have to be a contribution.
Right now, for new features, ONNX import and expanding the model zoo are going to take priority.
Beyond that, bug reports are also more of a focus.
I only need fewer than 10 lines of pseudocode in SameDiff to show the idea of implementing tf.nest.map_structure(…) and tf.keras.layers.experimental.EinsumDense("…ik,ki->…i", …); there's no need for the whole tf-gat.
I have dived into tf-gat and confirmed that I can port it to SameDiff except for these two functions. And these two functions are normal ops, not related to GNNs.
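For reference, tf.nest.map_structure isn't a graph op at all: it applies a function to every leaf of an arbitrarily nested structure (lists/tuples/dicts) and returns a structure of the same shape, so there is nothing to define a backprop for; gradients come from the ops applied at the leaves. A hypothetical Java equivalent over nested lists/maps of SDVariables (the helper name and the structure types handled here are made up for illustration, nothing SameDiff-specific beyond SDVariable):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

import org.nd4j.autodiff.samediff.SDVariable;

public class MapStructure {
    // Applies fn to every SDVariable leaf of a nested List/Map structure,
    // returning a new structure of the same shape (a map_structure analog).
    @SuppressWarnings("unchecked")
    public static Object mapStructure(Function<SDVariable, SDVariable> fn, Object structure) {
        if (structure instanceof SDVariable) {
            return fn.apply((SDVariable) structure);          // leaf: apply the op
        } else if (structure instanceof List) {
            return ((List<Object>) structure).stream()
                    .map(e -> mapStructure(fn, e))
                    .collect(Collectors.toList());            // rebuild the list
        } else if (structure instanceof Map) {
            Map<Object, Object> out = new LinkedHashMap<>();
            ((Map<Object, Object>) structure)
                    .forEach((k, v) -> out.put(k, mapStructure(fn, v)));
            return out;                                       // rebuild the map
        }
        throw new IllegalArgumentException("Unsupported structure: " + structure);
    }
}
```

Usage would look like `mapStructure(v -> v.mul(2.0), nested)`: the traversal itself carries no autodiff logic, only the leaf ops do.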