Hello! Could somebody please help me with a very simple case of linear regression in DL4J? I have already spent several hours trying to achieve what I did in Python and TensorFlow in half an hour.
So, I am trying to fit a linear regression for the function y = w * x + b, where x is between 0 and 100, w is 0.5, and b is 1.4. I generated the input data:
import static org.nd4j.linalg.api.buffer.DataType.DOUBLE;

import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

int batch = 100;
var w = 0.5;
var b = 1.4;
// x: 100 evenly spaced points in [0, 100], shaped as a (batch, 1) column
INDArray x = Nd4j.linspace(0, 100, 100).reshape(batch, 1);
// y = w * x + b, plus Gaussian noise (mean 2.0, stddev 1)
INDArray y = x
        .mul(w)
        .add(b)
        .add(Nd4j.random().normal(2.0, 1, DOUBLE, batch, 1));
Then I would like to create a very simple network with only one layer, one input (x), and one output (y). My goal is to get the prediction y1 on the same scale as y!
I also understand that I should use a normalizer to help the network train better, so I chose:
import java.util.List;

import org.deeplearning4j.datasets.iterator.impl.ListDataSetIterator;
import org.nd4j.linalg.dataset.DataSet;
import org.nd4j.linalg.dataset.api.preprocessor.NormalizerStandardize;

NormalizerStandardize normalizer = new NormalizerStandardize();
normalizer.fitLabel(true); // normalize the labels too, not only the features
DataSet inputDs = new DataSet(x, y);
normalizer.fit(inputDs); // collect mean/std statistics from the dataset
var iterator = new ListDataSetIterator<>(List.of(inputDs));
iterator.setPreProcessor(normalizer); // normalize each batch on the fly
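To sanity-check that fitLabel(true) took effect, I believe the fitted statistics can be printed like this (a sketch; as far as I understand, getMean()/getStd() and getLabelMean()/getLabelStd() are the NormalizerStandardize getters for the fitted statistics):

// print the statistics the normalizer collected from fit()
System.out.println("feature mean/std: " + normalizer.getMean() + " / " + normalizer.getStd());
System.out.println("label mean/std:   " + normalizer.getLabelMean() + " / " + normalizer.getLabelStd());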
Then my network:
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.nn.weights.WeightInit;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.learning.config.Nesterovs;
import org.nd4j.linalg.lossfunctions.LossFunctions;

var conf = new NeuralNetConfiguration.Builder()
        .seed(10)
        .weightInit(WeightInit.XAVIER)
        .updater(new Nesterovs(0.01, 0.9))
        .list()
        // a single layer with identity activation and MSE loss:
        // exactly a linear model y = w * x + b
        .layer(0, new OutputLayer.Builder(LossFunctions.LossFunction.MSE)
                .biasInit(b)
                .activation(Activation.IDENTITY)
                .nIn(1)
                .nOut(1)
                .build())
        .build();
var network = new MultiLayerNetwork(conf);
network.init();
Training:
// 201 passes over the single-batch dataset
for (int epoch = 0; epoch < 201; epoch++) {
    iterator.reset();
    network.fit(iterator);
}
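To watch whether the loss actually decreases, a score listener can be attached before training (a sketch using DL4J's ScoreIterationListener; the print interval of 50 is an arbitrary choice):

import org.deeplearning4j.optimize.listeners.ScoreIterationListener;

network.setListeners(new ScoreIterationListener(50)); // print the training score every 50 iterations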
And the output:
var y1 = network.output(x, false); // note: x here is the raw, unnormalized input
normalizer.revertLabels(y1);
And I get just a wrong result: all predictions collapse into a single point on the plot, so y1 is not on the same scale as y. When I instead transform the input data with the normalizer before calling output, y1 is the line I expect, but then I couldn't manage to revert it back to the original scale of y.
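For reference, this is roughly the "transform first" variant I have in mind (my untested sketch; I assume transform(INDArray) normalizes the features in place and revertLabels(INDArray) undoes the label normalization in place):

INDArray xNorm = x.dup();                    // copy so the original x stays untouched
normalizer.transform(xNorm);                 // scale the features the same way as during training
INDArray y1 = network.output(xNorm, false);  // prediction in the normalized label space
normalizer.revertLabels(y1);                 // map the prediction back to the original y scale

Is that the intended way to combine output() with NormalizerStandardize?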
Could someone please provide a working example for such a simple task? Thank you in advance!