Hi everyone. I made a simple example to get to know deeplearning4j:
- the data has the shape Map&lt;Double[], Double&gt;: the Double[] key holds the features, 2 numbers chosen randomly between 0 and 100, and the Double value is the target: 1 if their sum is above 50, 0 if not (a rough generation sketch follows the network code below).
- Here is how I make the Iterator:
public static DataSetIterator fromEntriesToDataSet(Map&lt;Double[], Double&gt; entries) {
    // features: one row per entry, numFeatures columns
    // (this relies on keySet() and values() iterating in the same order, so rows and labels line up)
    final Double[][] boxedDoubles = entries.keySet().toArray(new Double[entries.size()][numFeatures]);
    double[][] unboxedDoubles = Utilities.unbox2DArrayOfDoubles(boxedDoubles, numFeatures);
    INDArray inputNDArray = Nd4j.create(unboxedDoubles);

    // labels: one column, one row per entry
    final Double[] bools = entries.values().toArray(new Double[entries.size()]);
    INDArray outPut = Nd4j.create(Utilities.unbox1DArrayOfDoubles(bools), entries.size(), 1);

    DataSet dataSet = new DataSet(inputNDArray, outPut);
    List&lt;DataSet&gt; listDs = dataSet.asList();
    // a single batch containing the whole dataset
    return new ListDataSetIterator&lt;&gt;(listDs, entries.size());
}
- Here’s how I make the network:
static int numInput_ofNetwork = 2;
static int nHidden_ofNetwork = 110;
static int numOutputs_ofNetwork = 1;
private static MultiLayerNetwork makeNetworkModel() {
    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
            .activation(Activation.TANH)
            .weightInit(WeightInit.XAVIER)
            .updater(new Sgd(0.1))
            .l2(1e-4)
            .list()
            // hidden layer: 2 inputs -> 110 tanh units
            .layer(new DenseLayer.Builder().nIn(numInput_ofNetwork).nOut(nHidden_ofNetwork).build())
            // output layer: sigmoid + binary cross-entropy for the 0/1 target
            .layer(new OutputLayer.Builder(LossFunctions.LossFunction.XENT)
                    .activation(Activation.SIGMOID).nIn(nHidden_ofNetwork).nOut(numOutputs_ofNetwork).build())
            .build();
    MultiLayerNetwork model = new MultiLayerNetwork(conf);
    model.init();
    return model;
}
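For reference, here is a minimal sketch of how entries like the ones described above can be generated (the seed and sample count are placeholders; my actual generation code is equivalent but may differ in detail):
static Map&lt;Double[], Double&gt; makeEntries(int count) {
    Random rnd = new Random(42); // placeholder seed
    Map&lt;Double[], Double&gt; entries = new HashMap&lt;&gt;();
    for (int i = 0; i &lt; count; i++) {
        double a = rnd.nextDouble() * 100; // first feature, 0..100
        double b = rnd.nextDouble() * 100; // second feature, 0..100
        // target: 1.0 if the two numbers together are above 50, otherwise 0.0
        entries.put(new Double[]{a, b}, (a + b &gt; 50) ? 1.0 : 0.0);
    }
    return entries;
}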
I assume the mistake is somewhere in there; if not, tell me and I'll post more code.
I wasn't expecting the model to be outstanding on the first try, but the fact that it only ever predicts zero suggests I made a mistake somewhere.
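This is roughly how I train and check the output (makeEntries is the sketch above; the sample count and epoch count here are arbitrary):
Map&lt;Double[], Double&gt; entries = makeEntries(1000);
DataSetIterator trainIterator = fromEntriesToDataSet(entries);
MultiLayerNetwork model = makeNetworkModel();

for (int epoch = 0; epoch &lt; 100; epoch++) {
    trainIterator.reset();
    model.fit(trainIterator);
}

// 30 + 40 = 70 &gt; 50, so I would expect an output close to 1 here,
// but the prediction stays near 0 for every pair I try
INDArray sample = Nd4j.create(new double[][]{{30.0, 40.0}});
System.out.println(model.output(sample));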