A very simple case of linear regression

Hello! Could somebody please help me with a very simple case of linear regression in DL4J? I have already spent several hours trying to achieve what I did in Python and TensorFlow in half an hour.
So, I am trying to do a linear regression for the function y = w * x + b, where x is between 0 and 100, w is 0.5, and b is 1.4. I generated the input data:

    import static org.nd4j.linalg.api.buffer.DataType.DOUBLE;

    import org.nd4j.linalg.api.ndarray.INDArray;
    import org.nd4j.linalg.factory.Nd4j;

    int batch = 100;
    var w = 0.5;
    var b = 1.4;
    // 100 evenly spaced x values in [0, 100], shaped as a (batch, 1) column
    INDArray x = Nd4j.linspace(0, 100, 100).reshape(batch, 1);
    // y = w * x + b plus Gaussian noise
    INDArray y = x
        .mul(w)
        .add(b)
        .add(Nd4j.random().normal(2.0, 1, DOUBLE, batch, 1));

Then I would like to create a very simple network with only one layer, one input (x), and one output (y). My goal is to get y1 on the same scale as y!
I also understand that I should use a normalizer to help the network train better, so I chose:

    // Standardize features to zero mean / unit variance;
    // fitLabel(true) makes the normalizer handle the labels as well
    NormalizerStandardize normalizer = new NormalizerStandardize();
    normalizer.fitLabel(true);

    DataSet inputDs = new DataSet(x, y);
    normalizer.fit(inputDs);   // collect statistics from the dataset
    var iterator = new ListDataSetIterator<>(List.of(inputDs));
    iterator.setPreProcessor(normalizer);   // normalize each batch on the fly

then my network:

    var conf = new NeuralNetConfiguration.Builder()
        .seed(10)
        .weightInit(WeightInit.XAVIER)
        .updater(new Nesterovs(0.01, 0.9))
        .list()
        .layer(0, new OutputLayer.Builder(LossFunctions.LossFunction.MSE)
            .biasInit(b)                      // start the bias at the true value
            .activation(Activation.IDENTITY)  // linear output for regression
            .nIn(1)
            .nOut(1).build())
        .build();
    var network = new MultiLayerNetwork(conf);
    network.init();

training:

    for (int epoch = 0; epoch < 201; epoch++) {
        iterator.reset();
        network.fit(iterator);
    }

and output:

    var y1 = network.output(x, false);
    normalizer.revertLabels(y1);

and I get just a wrong result: everything collapses into a single point on the plot. When I tried to transform the input data with the normalizer, I got a wrong scale for y1.
So I couldn't manage to get a correct y1 on the same scale as y.
When I transform the input data with the normalizer and then compute y1, it works perfectly and y1 is the expected line, but I couldn't revert everything back to the original scale.
Could someone please provide a working example for such a simple task? Thank you in advance!

@roman toy examples have issues because, by default, the higher-level DL4J API divides the gradient by the minibatch size during learning.
I know it’s not a good “hello world” experience, but it is part of the older API.
That’s standard practice in minibatch learning.

TF and co’s equivalent within DL4J is actually SameDiff, if you want a declarative API.
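
A bare-bones SameDiff version of your regression would look roughly like the sketch below. It is an untested, self-contained sketch: the placeholder names, the Adam updater, the zero initializations, and the input array xIn are illustrative choices, not something prescribed by the API.

    import java.util.Map;
    import org.nd4j.autodiff.samediff.SDVariable;
    import org.nd4j.autodiff.samediff.SameDiff;
    import org.nd4j.autodiff.samediff.TrainingConfig;
    import org.nd4j.linalg.api.buffer.DataType;
    import org.nd4j.linalg.api.ndarray.INDArray;
    import org.nd4j.linalg.factory.Nd4j;
    import org.nd4j.linalg.learning.config.Adam;

    // Declare the graph: pred = x * w + b, trained with MSE loss
    SameDiff sd = SameDiff.create();
    SDVariable in = sd.placeHolder("x", DataType.DOUBLE, -1, 1);
    SDVariable label = sd.placeHolder("y", DataType.DOUBLE, -1, 1);
    SDVariable w = sd.var("w", Nd4j.zeros(DataType.DOUBLE, 1, 1));
    SDVariable b = sd.var("b", Nd4j.zeros(DataType.DOUBLE, 1));
    SDVariable pred = in.mmul(w).add("pred", b);
    sd.loss().meanSquaredError("mse", label, pred, null);

    // Tell the trainer which placeholders the DataSet feeds
    sd.setTrainingConfig(new TrainingConfig.Builder()
        .updater(new Adam(0.01))
        .dataSetFeatureMapping("x")
        .dataSetLabelMapping("y")
        .build());
    sd.fit(iterator, 200);   // the iterator from the question works here too

    // Inference: evaluate "pred" for some input array xIn (hypothetical name)
    INDArray y1 = sd.outputSingle(Map.of("x", xIn), "pred");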

Set miniBatch(false) in your configuration and it will not do that division.
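
Applied to the configuration from your question, that is one extra line in the builder (everything else unchanged):

    var conf = new NeuralNetConfiguration.Builder()
        .seed(10)
        .miniBatch(false)   // keep the raw gradient; no division by the batch size
        .weightInit(WeightInit.XAVIER)
        .updater(new Nesterovs(0.01, 0.9))
        .list()
        .layer(0, new OutputLayer.Builder(LossFunctions.LossFunction.MSE)
            .biasInit(b)
            .activation(Activation.IDENTITY)
            .nIn(1)
            .nOut(1).build())
        .build();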

@agibsonccc thank you for the reply.
Yes, I found some examples with SameDiff, and you have just confirmed that it is similar to TF; I will try it. I will also try to disable minibatching in the MultiLayerNetwork. To be honest, I finally managed to get the test working, but I had to transform the initial dataset and then restore the features and labels manually. It is not a very convenient or intuitive way, but at least I found it.
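
In case it helps someone else, the manual path I ended up with looks roughly like this (my own workaround, using the normalizer’s transform/revertLabels methods; there may be a cleaner way):

    // Normalize a copy of the raw features exactly as during training,
    // run inference, then map the predictions back to the original label scale.
    INDArray xNorm = x.dup();
    normalizer.transform(xNorm);                  // in-place feature standardization
    INDArray y1 = network.output(xNorm, false);   // predictions in normalized label space
    normalizer.revertLabels(y1);                  // in-place revert to the scale of y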