Batch-size-agnostic ReshapeVertex?

I have a question about ReshapeVertex. Many layer types can be used with inputs of any batch size. I can’t find much in the way of example code using ReshapeVertex, but it seems to require that the actual batch size is included in the output dimensions. For example, if individual inputs are vectors of length 100 and I want to transform them to 10x10 matrices, I can’t use ReshapeVertex(10, 10), but rather ReshapeVertex(n, 10, 10), where n is the batch size.
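To make the constraint concrete, here is a plain-Java sketch (not DL4J code) of the reshape being described: a batch of n length-100 vectors becomes an [n, 10, 10] array, so the batch dimension n has to appear in the output shape even though it's only known at runtime.

```java
// Plain-Java illustration of why the batch dimension shows up in the
// target shape: a batch of n examples, each a length-100 vector,
// reshaped so each example becomes a 10x10 matrix.
public class ReshapeDemo {
    // Reshape a [batch, 100] input to [batch, 10, 10].
    static double[][][] reshapeBatch(double[][] input) {
        int n = input.length; // batch size, known only at runtime
        double[][][] out = new double[n][10][10];
        for (int b = 0; b < n; b++) {
            for (int i = 0; i < 100; i++) {
                out[b][i / 10][i % 10] = input[b][i];
            }
        }
        return out;
    }

    public static void main(String[] args) {
        double[][] batch = new double[32][100]; // batch size 32
        batch[5][37] = 42.0;
        double[][][] reshaped = reshapeBatch(batch);
        System.out.println(reshaped.length);   // 32
        System.out.println(reshaped[5][3][7]); // 42.0
    }
}
```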

Is it possible to employ ReshapeVertex without committing the network to a particular batch size? Alternatively, is there another layer or vertex type that can be used to reshape its input without knowing the batch size ahead of time? Thanks!

I don’t see any reason why we couldn’t add this. Mind filing an issue? https://github.com/eclipse/deeplearning4j/issues

If you want, you could also do a PR:

Unfortunately, there’s not really a workaround unless you want to just do a custom vertex yourself for your use case:

Maybe I’m way off, but isn’t a 1x10x10 the same as a 10x10?

Kind of; the issue is that right now it only reshapes to the exact shape given. There’s no built-in functionality for saying “keep this subset of dimensions fixed, and tolerate a changing size on the first axis.”
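For reference, the missing behavior resembles a “wildcard” axis such as NumPy’s `reshape(-1, 10, 10)`, where one dimension is inferred from the total element count. A minimal sketch of that shape-resolution logic (this is a hypothetical helper, not an existing DL4J API):

```java
import java.util.Arrays;

// Hypothetical sketch: resolve a target shape that may use -1 for one
// axis, inferring its size from the total element count at runtime
// (analogous to NumPy's reshape(-1, 10, 10)).
public class WildcardReshape {
    static long[] resolveShape(long totalElements, long[] target) {
        long known = 1;
        int wildcard = -1;
        for (int i = 0; i < target.length; i++) {
            if (target[i] == -1) {
                wildcard = i; // remember which axis to infer
            } else {
                known *= target[i];
            }
        }
        long[] resolved = target.clone();
        if (wildcard >= 0) {
            if (totalElements % known != 0) {
                throw new IllegalArgumentException(
                        "cannot reshape " + totalElements + " elements into "
                        + Arrays.toString(target));
            }
            resolved[wildcard] = totalElements / known;
        }
        return resolved;
    }

    public static void main(String[] args) {
        // 32 examples of length 100 -> batch axis inferred as 32
        long[] shape = resolveShape(3200, new long[]{-1, 10, 10});
        System.out.println(Arrays.toString(shape)); // [32, 10, 10]
    }
}
```

A batch-size-agnostic ReshapeVertex would essentially apply this resolution step each forward pass, using the current minibatch size to fill in the wildcard axis.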