Optimising DL4J deployment size


As I discussed in another thread recently, I need to deploy DL4J as a megajar (or fat JAR, if you prefer), with all of the DL4J dependencies bundled inside it. This file is 1.2 GB.

For my purposes, I simply need to load a MultiLayerNetwork and a normalizer from disk and run a new value through the network (model.output). By manually opening the JAR with a tool such as WinRAR, I am able to delete unneeded files (e.g. the Android libraries). That brings it down to 800 MB.

Although I would prefer one JAR for all operating systems, I can cut it down to 400 MB by building a JAR for just one operating system. Ideally, though, I would like it to be around 100 MB.

Going in and deleting classes manually with WinRAR feels like a poor solution. Do you have a better recommendation for reducing the DL4J size?

Instead of depending on nd4j-native-platform, which bundles the native binaries for every supported OS and architecture, depend on the nd4j-native backend together with the classifier for your target platform. This will help you reduce the JAR size by only including the necessary binaries for the deployment platform.
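As a rough sketch of what that looks like in a Maven POM (the version here is illustrative, and the classifier should match your actual deployment platform):

```xml
<!-- Instead of nd4j-native-platform, use the plain backend artifact
     plus the classifier for one target platform only. -->
<dependency>
    <groupId>org.nd4j</groupId>
    <artifactId>nd4j-native</artifactId>
    <version>1.0.0-M2.1</version> <!-- illustrative version -->
</dependency>
<dependency>
    <groupId>org.nd4j</groupId>
    <artifactId>nd4j-native</artifactId>
    <version>1.0.0-M2.1</version>
    <classifier>linux-x86_64</classifier> <!-- e.g. windows-x86_64, macosx-x86_64 -->
</dependency>
```

If you want one build per operating system, a Maven profile per platform classifier is a common way to switch this at build time.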

You probably also want to trim the other dependencies, as deeplearning4j-core pulls in a lot of transitive dependencies that aren't necessary for your use case.

Try starting with just deeplearning4j-nn, then add back whatever you are missing one by one until you reach a minimal working set.
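For loading a saved network and running inference, a minimal dependency set might look like the sketch below (version illustrative; you may find you need to add back one or two artifacts depending on what your normalizer and data pipeline use):

```xml
<!-- Minimal sketch: the NN module plus a single-platform native backend,
     rather than deeplearning4j-core and the -platform artifacts. -->
<dependency>
    <groupId>org.deeplearning4j</groupId>
    <artifactId>deeplearning4j-nn</artifactId>
    <version>1.0.0-M2.1</version> <!-- illustrative version -->
</dependency>
<dependency>
    <groupId>org.nd4j</groupId>
    <artifactId>nd4j-native</artifactId>
    <version>1.0.0-M2.1</version>
    <classifier>linux-x86_64</classifier> <!-- match your deployment platform -->
</dependency>
```

Build the fat JAR from this, run your load-and-infer path, and only add a dependency back when something actually fails at runtime with a missing class.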