Correct resettable DataSetIterator for a DataSet?

I've been using new IteratorDataSetIterator(trainingData.iterator(), batchSize) with my models up till now, which has been fine. But now I am using a much larger training set of 1 million examples, which is too big to normalize in one go, as I run out of GPU memory:

Exception in thread "main" java.lang.RuntimeException: Memory allocation for tadOffsets failed; Error code: [2]
at org.nd4j.linalg.jcublas.ops.executioner.CudaExecutioner.tadShapeInfoAndOffsets(
at org.nd4j.jita.allocator.tad.BasicTADManager.getTADOnlyShapeInfo(
at org.nd4j.jita.allocator.tad.DeviceTADManager.getTADOnlyShapeInfo(
at org.nd4j.linalg.jcublas.ops.executioner.CudaExecutioner.exec(
at org.nd4j.linalg.api.ops.executioner.DefaultOpExecutioner.execAndReturn(
at org.nd4j.linalg.dataset.api.preprocessor.StandardizeStrategy.preProcess(
at org.nd4j.linalg.dataset.api.preprocessor.StandardizeStrategy.preProcess(
at org.nd4j.linalg.dataset.api.preprocessor.AbstractDataSetNormalizer.transform(
at org.nd4j.linalg.dataset.api.preprocessor.AbstractDataSetNormalizer.preProcess(
at org.nd4j.linalg.dataset.api.preprocessor.AbstractDataSetNormalizer.transform(
at org.nd4j.linalg.dataset.api.preprocessor.AbstractDataSetNormalizer.transform(
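For reference, my setup is roughly the following (a simplified sketch; the method and variable names are mine, not from the library):

```java
import org.deeplearning4j.datasets.iterator.IteratorDataSetIterator;
import org.nd4j.linalg.dataset.DataSet;
import org.nd4j.linalg.dataset.api.iterator.DataSetIterator;
import org.nd4j.linalg.dataset.api.preprocessor.NormalizerStandardize;

class Sketch {
    // Fit the normalizer on the full in-memory DataSet, then transform it
    // in one call -- that single big in-place transform is where the GPU
    // allocation fails for 1M examples.
    DataSetIterator buildIterator(DataSet trainingData, int batchSize) {
        NormalizerStandardize normalizer = new NormalizerStandardize();
        normalizer.fit(trainingData);        // gathers mean/std over the whole set
        normalizer.transform(trainingData);  // one-shot transform -> OOM here
        // The iterator just wraps the DataSet's per-example Iterator<DataSet>
        return new IteratorDataSetIterator(trainingData.iterator(), batchSize);
    }
}
```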

Unfortunately, neither IteratorDataSetIterator nor trainingData.iterateWithMiniBatches() works here, as they are not resettable:

Exception in thread "main" java.lang.NullPointerException: Cannot invoke "org.nd4j.linalg.dataset.api.iterator.DataSetIterator.reset()" because "iterator" is null
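As far as I can tell, the root cause is simply that a plain java.util.Iterator is one-shot, so a wrapper around it has nothing to rewind; only something holding the backing collection can hand out a fresh iterator. A toy illustration (plain Java, nothing DL4J-specific):

```java
import java.util.Iterator;
import java.util.List;

// Toy illustration: a wrapper that keeps a reference to the backing List
// can implement reset() by simply requesting a fresh iterator, whereas a
// wrapper given only an Iterator has no way to rewind.
class ResettableWrapper<T> {
    private final List<T> backing;
    private Iterator<T> current;

    ResettableWrapper(List<T> backing) {
        this.backing = backing;
        this.current = backing.iterator();
    }

    boolean hasNext() { return current.hasNext(); }

    T next() { return current.next(); }

    // Start a fresh pass over the same data
    void reset() { current = backing.iterator(); }
}
```

Usage: after one full pass exhausts the iterator, reset() makes a second epoch possible again.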

I've been looking at all the DataSetIterator implementations available at DataSetIterator (deeplearning4j 1.0.0-beta7 API), but can't seem to find one that is applicable to just a vanilla in-memory DataSet.

Any recommendations?

On a side note, I've noticed that when I save my normalizer settings after calling fit(), using:
StandardizeSerializerStrategy ns = new StandardizeSerializerStrategy();
try {
    ns.write(normalizer, new FileOutputStream(new File("/")));
} catch (IOException e) {
    e.printStackTrace();
}
the resulting file is only 267 bytes! Is this how large it's meant to be?

Thanks in advance