Minimal version for inference

I’ve used the full version of DL4J to train a model (relatively simple MultiLayerNetwork). I’d now like to be able to do inference using this model in my software, but not have to bundle all of DL4J. Is there any guidance on embedding a subset of DL4J? Just enough to run the model, assuming I provide all the code to generate appropriate inputs to the network.

@bjohnson I assume you’re using deeplearning4j-core? Usually you can get away with deeplearning4j-nn plus an ND4J backend. You can also trim dependencies by specifying the platform you’re building for. See the docs for more information on that.
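For a concrete starting point, a minimal inference-only dependency set looks roughly like this. The artifact names are the real DL4J/ND4J Maven coordinates; the version is just an example, so match it to whatever release you trained with:

```xml
<!-- deeplearning4j-nn: network config + inference, without the extras in -core -->
<dependency>
    <groupId>org.deeplearning4j</groupId>
    <artifactId>deeplearning4j-nn</artifactId>
    <version>1.0.0-M2.1</version> <!-- example version; use the one you trained with -->
</dependency>
<!-- CPU backend; adding a <classifier> like linux-x86_64 ships only that
     platform's native binaries instead of all of them -->
<dependency>
    <groupId>org.nd4j</groupId>
    <artifactId>nd4j-native</artifactId>
    <version>1.0.0-M2.1</version>
</dependency>
```

If you need GPU inference instead, the nd4j-cuda backends slot in the same way, but they pull in considerably larger native dependencies.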

Thanks for the suggestions. I’ve been using deeplearning4j-core, so I’ll check out just using deeplearning4j-nn. I think that’s exactly what I want.

@bjohnson great, yeah. deeplearning4j-core pulls in a bunch of extra things with it, but the configuration API in deeplearning4j-nn is pretty small. Feel free to ask if anything else comes up.
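For anyone else landing on this thread: inference with just deeplearning4j-nn and a backend is only a few lines. This is a sketch, assuming the model was saved with `ModelSerializer.writeModel(...)` (or `net.save(...)`); the `model.zip` path and the input shape are placeholders you'd replace with your own:

```java
import java.io.File;
import java.io.IOException;

import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class InferenceOnly {
    public static void main(String[] args) throws IOException {
        // Restore a previously trained network; second arg = false skips the
        // updater state, which is only needed for continued training
        MultiLayerNetwork net = MultiLayerNetwork.load(new File("model.zip"), false);

        // Build one input row; shape [1, 3] is a placeholder and must match
        // the network's expected input size
        INDArray features = Nd4j.create(new float[]{0.1f, 0.2f, 0.3f}, new int[]{1, 3});

        // Forward pass only — no training code involved
        INDArray output = net.output(features);
        System.out.println(output);
    }
}
```

Since `MultiLayerNetwork`, `ModelSerializer`, and the `Nd4j` factory all live in deeplearning4j-nn and the ND4J backend, nothing from deeplearning4j-core is needed at runtime.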