Hi, everyone. I'm trying to load the Google News word2vec model into DL4J, to use it later for sentiment analysis. I'm able to do it, but at a big cost in RAM. I read about workspaces and how their memory is held off-heap, so I tried loading the model with loadStaticModel inside a memory-mapped workspace, just to see if my RAM consumption would be lower. However, it fails whenever I set the available amount of RAM to less than 3.4 GB (the size of the Google vectors model). So, does that mean workspaces only apply to certain operations? Is this a bug? The code is quite simple:
import java.io.File;
import org.deeplearning4j.models.embeddings.loader.WordVectorSerializer;
import org.deeplearning4j.models.embeddings.wordvectors.WordVectors;
import org.nd4j.linalg.api.memory.MemoryWorkspace;
import org.nd4j.linalg.api.memory.conf.WorkspaceConfiguration;
import org.nd4j.linalg.api.memory.enums.AllocationPolicy;
import org.nd4j.linalg.api.memory.enums.LocationPolicy;
import org.nd4j.linalg.api.memory.enums.ResetPolicy;
import org.nd4j.linalg.factory.Nd4j;

// Memory-mapped workspace backed by a temp file on disk
final WorkspaceConfiguration mmap = WorkspaceConfiguration.builder()
        .initialSize(9500000000L) // ~9.5 GB
        .policyLocation(LocationPolicy.MMAP)
        .policyAllocation(AllocationPolicy.STRICT)
        .policyReset(ResetPolicy.ENDOFBUFFER_REACHED)
        .tempFilePath("/Users/Downloads/temporalFile.temp")
        .build();

final File wordVectorsFile = new File("/sentiment/GoogleNews-vectors-negative3002.bin");
try (final MemoryWorkspace ws = Nd4j.getWorkspaceManager().getAndActivateWorkspace(mmap, "M2")) {
    final WordVectors wordVectors = WordVectorSerializer.loadStaticModel(wordVectorsFile);
}
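For context on what I expected LocationPolicy.MMAP to do: as I understand it, the workspace buffer is backed by a memory-mapped temp file, so pages live on disk and are swapped in by the OS rather than being held in RAM or on the JVM heap. A minimal stdlib-only sketch of that idea (plain java.nio, no DL4J; the file name and 1 MB size are just placeholders for illustration):

```java
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class MmapSketch {
    public static void main(String[] args) throws IOException {
        // Temp file backing the mapping, analogous to the workspace's tempFilePath
        Path temp = Files.createTempFile("workspaceSketch", ".temp");
        long size = 1024L * 1024L; // 1 MB here; a real workspace would be GBs
        try (FileChannel ch = FileChannel.open(temp,
                StandardOpenOption.READ, StandardOpenOption.WRITE)) {
            // Map the file: pages are loaded/evicted by the OS, not held on the JVM heap
            MappedByteBuffer buf = ch.map(FileChannel.MapMode.READ_WRITE, 0, size);
            buf.putFloat(0, 3.14f);              // write through the mapping
            System.out.println(buf.getFloat(0)); // read it back from the mapped file
        }
        Files.deleteIfExists(temp);
    }
}
```

That is the behavior I was hoping the workspace would give me for the vectors loaded by loadStaticModel.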
Any clues?