OutOfMemoryError and Memory Management

Hi,

I am currently working on a scenario with a somewhat bigger observation space (541 dimensions, to be exact). Unfortunately, this leads to OutOfMemoryErrors at roughly 16 GB, which is about half of my RAM. I read that half of the RAM is the default limit, so I tried to change it by setting the parameters -Xmx12G and -Xms12G. However, this did not change anything (though I may not have set them correctly through IntelliJ).
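
For reference, this is a minimal sketch of what I understand the VM options field of the IntelliJ run configuration should contain. The two -D properties are the JavaCPP settings mentioned in the DL4J memory documentation for capping ND4J's off-heap allocations; I am not certain I am applying them correctly:

    -Xms12G -Xmx12G -Dorg.bytedeco.javacpp.maxbytes=16G -Dorg.bytedeco.javacpp.maxphysicalbytes=20G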

I am also unsure why this scenario would produce such a massive amount of data: 541 doubles should fit into my RAM more than three million times over. I double-checked my user code, and it does not seem that I myself am wasting this much memory.
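
My back-of-the-envelope calculation:

    541 doubles × 8 bytes ≈ 4.3 KB per observation
    16 GB / 4.3 KB ≈ 3.7 million observations

so the raw observations alone should not come close to filling my RAM.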

However, in case this memory usage is normal, I wanted to use a memory-mapped file instead. I tried to create a workspace following the example given on the website, but it always fails with the same exception.
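
Here is roughly what I am doing. This is a minimal sketch: the workspace configuration, including the "M2" name, is copied from the ND4J memory-mapped workspace example on the website, and net / observation stand in for my actual model and input:

    import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
    import org.nd4j.linalg.api.memory.MemoryWorkspace;
    import org.nd4j.linalg.api.memory.conf.WorkspaceConfiguration;
    import org.nd4j.linalg.api.memory.enums.LocationPolicy;
    import org.nd4j.linalg.api.ndarray.INDArray;
    import org.nd4j.linalg.factory.Nd4j;

    public class MmapSketch {
        // net and observation stand in for my actual model and input
        static INDArray predict(MultiLayerNetwork net, INDArray observation) {
            // workspace configuration taken from the memory-mapped workspace example
            WorkspaceConfiguration mmap = WorkspaceConfiguration.builder()
                    .initialSize(1_000_000_000L)          // ~1 GB backing file
                    .policyLocation(LocationPolicy.MMAP)  // back the workspace with a memory-mapped file
                    .build();

            // activate the workspace and run inference inside it
            try (MemoryWorkspace ws = Nd4j.getWorkspaceManager().getAndActivateWorkspace(mmap, "M2")) {
                return net.output(observation);  // <- this is the call that throws
            }
        }
    }

And the exception it produces: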

Exception in thread "main" org.nd4j.linalg.workspace.ND4JWorkspaceException: Expected no workspace active in outputOfLayerDetached - Open/active workspaces: [M2]
    at org.nd4j.linalg.workspace.WorkspaceUtils.assertNoWorkspacesOpen(WorkspaceUtils.java:68)
    at org.deeplearning4j.nn.multilayer.MultiLayerNetwork.outputOfLayerDetached(MultiLayerNetwork.java:1207)
    at org.deeplearning4j.nn.multilayer.MultiLayerNetwork.output(MultiLayerNetwork.java:2415)

Do you have any advice on how to fix this issue? Apologies if this is a very naive question; I am still in the process of learning rl4j / deeplearning4j.