Hello All,
I’ve set up a mini-server that is constantly performing inferences very quickly in real time, but I also need it to perform a transform on a small amount of data. The LocalTransformExecutor is quite a bit faster than Spark (local), however I’m finding that it doesn’t release its memory. Running LocalTransformExecutor.execute() over and over, I eventually get:
org.apache.arrow.memory.OutOfMemoryException: Failure allocating buffer.
at io.netty.buffer.PooledByteBufAllocatorL.allocate(PooledByteBufAllocatorL.java:56)
at org.apache.arrow.memory.AllocationManager.&lt;init&gt;(AllocationManager.java:90)
This is easily tested by simply putting execute() in a loop; the memory won’t be released until you exit the JVM. If I pause the data collection it doesn’t get collected, and calling System.gc() does nothing. I’ve tried many of the Nd4j static memory-management calls to release the allocations (purgeCache etc.), and even tried setting up workspaces. I know workspaces have been implemented and have fixed similar problems with training/inference, but I don’t think they’re wired up for local DataVec. From what I can tell it seems to be Arrow holding onto references, but I’m not sure.
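For reference, the loop test looks roughly like this minimal sketch. The schema and transform here are just placeholders I made up for illustration (a single double column multiplied by 2); the leak doesn’t seem to depend on what the transform actually does:

```java
import java.util.Arrays;
import java.util.List;

import org.datavec.api.transform.MathOp;
import org.datavec.api.transform.TransformProcess;
import org.datavec.api.transform.schema.Schema;
import org.datavec.api.writable.DoubleWritable;
import org.datavec.api.writable.Writable;
import org.datavec.local.transforms.LocalTransformExecutor;

public class LeakRepro {
    public static void main(String[] args) {
        // Trivial placeholder schema/transform -- any TransformProcess
        // appears to reproduce the behaviour.
        Schema schema = new Schema.Builder()
                .addColumnDouble("x")
                .build();
        TransformProcess tp = new TransformProcess.Builder(schema)
                .doubleMathOp("x", MathOp.Multiply, 2.0)
                .build();

        List<List<Writable>> data = Arrays.asList(
                Arrays.<Writable>asList(new DoubleWritable(1.0)));

        // Off-heap (Arrow/Netty direct) memory grows with each iteration
        // and is never released, eventually ending in
        // org.apache.arrow.memory.OutOfMemoryException.
        for (int i = 0; i < 1_000_000; i++) {
            LocalTransformExecutor.execute(data, tp);
        }
    }
}
```

Watching the process RSS (or the Netty direct-memory counters) while this runs shows the growth, while the JVM heap itself stays small.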
Does anyone know a simple way to get it to release its memory, or am I waiting for a fix here (or trying to fix it myself)?
Thanks in advance.