Keras Custom Layer

I have a custom Keras dense layer developed in Python.

    import tensorflow as tf
    from tensorflow.keras.layers import Activation

    class CustomDense(tf.keras.layers.Layer):
        def __init__(self, num_units, activation = "sigmoid", **kwargs):
            super(CustomDense, self).__init__(**kwargs)

            self.num_units = num_units
            self.activation_name = activation
            self.activation = Activation(activation)

        def get_config(self):
            config = super(CustomDense, self).get_config()
            config.update({"num_units": self.num_units, "activation": self.activation_name})
            return config

        def build(self, input_shape):
            # Trainable weights for the 128 "other" input features.
            self.weight = self.add_weight(shape = [128, self.num_units], dtype = 'float32', name = 'other_features_weights')
            # Frozen weights, fixed at 1.0, for the single leading input feature.
            self.weight1 = self.add_weight(shape = [1, self.num_units], initializer = tf.keras.initializers.Constant(1.),
                                     dtype = 'float32',
                                     trainable = False, name = 'training_bid_weights')
            self.bias = self.add_weight(shape = [self.num_units], name = 'bias_weights')

        def call(self, input):
            # final_weight has shape [129, num_units]: frozen row stacked on top of the trainable block.
            self.final_weight = tf.concat((self.weight1, self.weight), axis = 0)
            y = tf.matmul(input, self.final_weight) + self.bias
            y = self.activation(y)
            return y

I am having issues creating the above custom dense layer in Java.

If anyone knows how to do it, it would help me a lot.

**Note:** I am new to Java.

@Vamshi did you try to actually do anything yet? We have an example of how to do the lambda layer and how to register it there.
I don't see any indication that you really read the example.

I don't think we can use the lambda layer because my custom layer in Python is a modified dense layer and I am doing the same thing in Java too.

    public class CustomDense extends KerasLayer {

        /* Number of trainable parameter arrays and whether the layer uses a bias. */
        private int numTrainableParams = 2;
        private boolean hasBias;

        /**
         * Pass-through constructor from KerasLayer
         *
         * @param kerasVersion major keras version
         * @throws UnsupportedKerasConfigurationException Unsupported Keras config
         */
        public CustomDense(Integer kerasVersion) throws UnsupportedKerasConfigurationException {
            super(kerasVersion);
        }

        /**
         * Constructor from parsed Keras layer configuration dictionary.
         *
         * @param layerConfig dictionary containing Keras layer configuration
         * @throws InvalidKerasConfigurationException     Invalid Keras config
         * @throws UnsupportedKerasConfigurationException Unsupported Keras config
         */

        // public CustomDense(Map<String, Object> layerConfig)
        // throws InvalidKerasConfigurationException, UnsupportedKerasConfigurationException {
        // this(layerConfig, true);
        // }

        /**
         * Constructor from parsed Keras layer configuration dictionary.
         *
         * @param layerConfig dictionary containing Keras layer configuration
         * @throws InvalidKerasConfigurationException     Invalid Keras config
         * @throws UnsupportedKerasConfigurationException Unsupported Keras config
         */
        public CustomDense(Map<String, Object> layerConfig)
                throws InvalidKerasConfigurationException, UnsupportedKerasConfigurationException {
            super(layerConfig);
            hasBias = KerasLayerUtils.getHasBiasFromConfig(layerConfig, conf);
            numTrainableParams = hasBias ? 2 : 1;

            LayerConstraint biasConstraint = KerasConstraintUtils.getConstraintsFromConfig(
                    layerConfig, conf.getLAYER_FIELD_B_CONSTRAINT(), conf, kerasMajorVersion);
            LayerConstraint weightConstraint = KerasConstraintUtils.getConstraintsFromConfig(
                    layerConfig, conf.getLAYER_FIELD_W_CONSTRAINT(), conf, kerasMajorVersion);

            // Glorot initializer intended for the trainable [128, num_units] block.
            IWeightInit init = KerasInitilizationUtils.getWeightInitFromConfig(layerConfig, conf.getINIT_GLOROT_UNIFORM(),
                    true, conf, kerasMajorVersion);

            // Constant initializer intended for the frozen [1, num_units] block.
            IWeightInit init1 = KerasInitilizationUtils.getWeightInitFromConfig(layerConfig, conf.getINIT_CONSTANT(),
                    false, conf, kerasMajorVersion);

            // init_final is meant to be the combination of init and init1 (the concatenated
            // weight matrix from the Python layer) -- this is the part I cannot figure out.
            DenseLayer.Builder builder = new DenseLayer.Builder().name(this.layerName)
                    .nOut(KerasLayerUtils.getNOutFromConfig(layerConfig, conf))
                    .activation(Activation.SIGMOID)
                    .weightInit(init_final)
                    .biasInit(0.0)
                    .hasBias(hasBias);
            if (biasConstraint != null)
                builder.constrainBias(biasConstraint);
            if (weightConstraint != null)
                builder.constrainWeights(weightConstraint);
            this.layer = builder.build();
        }

        /**
         * Get DL4J DenseLayer.
         *
         * @return DenseLayer
         */
        public DenseLayer getDenseLayer() {
            return (DenseLayer) this.layer;
        }

        /**
         * Get layer output type.
         *
         * @param inputType Array of InputTypes
         * @return output type as InputType
         * @throws InvalidKerasConfigurationException Invalid Keras config
         */
        @Override
        public InputType getOutputType(InputType... inputType) throws InvalidKerasConfigurationException {
            /* Check whether layer requires a preprocessor for this InputType. */
            InputPreProcessor preprocessor = getInputPreprocessor(inputType[0]);
            if (preprocessor != null) {
                return this.getDenseLayer().getOutputType(-1, preprocessor.getOutputType(inputType[0]));
            }
            return this.getDenseLayer().getOutputType(-1, inputType[0]);
        }

        /**
         * Returns number of trainable parameters in layer.
         *
         * @return number of trainable parameters (2)
         */
        @Override
        public int getNumParams() {
            return numTrainableParams;
        }

        /**
         * Set weights for layer.
         *
         * @param weights Dense layer weights
         */
        @Override
        public void setWeights(Map<String, INDArray> weights) throws InvalidKerasConfigurationException {
            this.weights = new HashMap<>();
            if (weights.containsKey(conf.getKERAS_PARAM_NAME_W()))
                this.weights.put(DefaultParamInitializer.WEIGHT_KEY, weights.get(conf.getKERAS_PARAM_NAME_W()));
            else
                throw new InvalidKerasConfigurationException(
                        "Parameter " + conf.getKERAS_PARAM_NAME_W() + " does not exist in weights");
            if (hasBias) {
                if (weights.containsKey(conf.getKERAS_PARAM_NAME_B()))
                    this.weights.put(DefaultParamInitializer.BIAS_KEY, weights.get(conf.getKERAS_PARAM_NAME_B()));
                else
                    throw new InvalidKerasConfigurationException(
                            "Parameter " + conf.getKERAS_PARAM_NAME_B() + " does not exist in weights");
            }
            KerasLayerUtils.removeDefaultWeights(weights, conf);
        }

    }

I have initialised the weights like below in Java:

    IWeightInit init = KerasInitilizationUtils.getWeightInitFromConfig(layerConfig, conf.getINIT_GLOROT_UNIFORM(),
                    true, conf, kerasMajorVersion);
    
    IWeightInit init1 = KerasInitilizationUtils.getWeightInitFromConfig(layerConfig, conf.getINIT_CONSTANT(),
            false, conf, kerasMajorVersion);

For the Python code:

    self.weight = self.add_weight(shape = [128, self.num_units], dtype = 'float32', name = 'other_features_weights')
    self.weight1 = self.add_weight(shape = [1, self.num_units], initializer = tf.keras.initializers.Constant(1.),
                                 dtype = 'float32',
                                 trainable = False, name = 'training_bid_weights')

Now I need to concatenate these weights like:

 self.final_weight = tf.concat((self.weight1, self.weight), axis = 0) 
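
For reference, the same axis-0 concatenation can be sanity-checked on the Java side with plain Nd4j. This is just a standalone sketch (the class name and the `numUnits` value are made up for illustration), not wired into the KerasLayer code above:

    import org.nd4j.linalg.api.buffer.DataType;
    import org.nd4j.linalg.api.ndarray.INDArray;
    import org.nd4j.linalg.factory.Nd4j;

    public class ConcatSketch {
        public static void main(String[] args) {
            int numUnits = 64;

            INDArray weight  = Nd4j.randn(DataType.FLOAT, 128, numUnits); // trainable block
            INDArray weight1 = Nd4j.ones(DataType.FLOAT, 1, numUnits);    // frozen, constant 1.0

            // Equivalent of tf.concat((weight1, weight), axis = 0) -> shape [129, numUnits]
            INDArray finalWeight = Nd4j.concat(0, weight1, weight);
            System.out.println(java.util.Arrays.toString(finalWeight.shape())); // [129, 64]
        }
    }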

@Vamshi perfect, this helps a lot!

You can do anything you want with the SameDiffLambdaLayer. SameDiff is a TensorFlow-like framework that supports all of the same ops you'd expect TF/PyTorch to support, and it is a lower-level API. What you're using is the DL4J interface.

Keras uses that higher-level interface for its networks, and you embed SameDiff lambda layers in order to do what you need.

You would not extend DL4J's dense layer here. The example shows that. So in this case just define your custom layer as a self-contained SameDiff graph.

What you'd want to do is define your ops in terms of that SameDiff graph. Practice that separately, put that into your lambda layer, then register it.
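
To make that concrete: for a weight-free computation, the pattern from the DL4J Keras-import docs looks roughly like the sketch below (the times-three computation and the `lambda_1` name are placeholders, not your layer):

    import org.deeplearning4j.nn.conf.inputs.InputType;
    import org.deeplearning4j.nn.conf.layers.samediff.SameDiffLambdaLayer;
    import org.nd4j.autodiff.samediff.SDVariable;
    import org.nd4j.autodiff.samediff.SameDiff;

    // A lambda layer is just a computation defined on the SameDiff graph.
    public class TimesThreeLambda extends SameDiffLambdaLayer {

        @Override
        public SDVariable defineLayer(SameDiff sd, SDVariable layerInput) {
            return layerInput.mul(3.0);
        }

        @Override
        public InputType getOutputType(int layerIndex, InputType inputType) {
            return inputType; // same shape in and out
        }
    }

It then gets registered before the model import, under the name the layer has in the Keras model, e.g. `KerasLayer.registerLambdaLayer("lambda_1", new TimesThreeLambda());`.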

In your case, there are several namespaces you can use.

You can do something like:

    SameDiff sd = SameDiff.create();
    sd.nn()....
    sd.math()..

There are various namespaces there you can use to set up various ops.
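
To make that snippet concrete, here is a tiny self-contained sketch (the ops and names are only illustrative) using one op from each namespace:

    import org.nd4j.autodiff.samediff.SDVariable;
    import org.nd4j.autodiff.samediff.SameDiff;
    import org.nd4j.linalg.factory.Nd4j;

    public class NamespaceSketch {
        public static void main(String[] args) {
            SameDiff sd = SameDiff.create();

            // A small constant, just to have something to run ops on.
            SDVariable x = sd.constant("x", Nd4j.rand(2, 3));

            SDVariable squared = sd.math().square(x);    // op from the math namespace
            SDVariable out = sd.nn().relu(squared, 0.0); // op from the nn namespace

            System.out.println(out.eval());              // evaluate the graph
        }
    }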

I have read this in the Konduit documentation:

SameDiffLambdaLayer Use this approach if your layer doesn’t have any weights and defines just a computation. It is most useful when you have to define a custom layer because you are using a lambda in your model definition. This is the approach you should be using when you’ve gotten the exception about no lambda layer being found.

But my custom layer has weights, actually two different weight tensors. That's why I went with KerasLayer.

@Vamshi SameDiff allows you to declare variables too. That includes placeholders etc. as well.

In your case, I would recommend just initializing the weights.

@Vamshi for constants or anything, you can specify various weight initializers, or even specify NDArrays you initialize manually and pass those to samediff.var(…). You can literally do anything.
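
A small sketch of what that looks like (shapes and names are only illustrative, loosely matching the Keras layer above): a trainable variable backed by an INDArray you initialize yourself, a frozen all-ones constant, and a placeholder for the input.

    import org.nd4j.autodiff.samediff.SDVariable;
    import org.nd4j.autodiff.samediff.SameDiff;
    import org.nd4j.linalg.api.buffer.DataType;
    import org.nd4j.linalg.factory.Nd4j;

    public class SameDiffVariablesSketch {
        public static void main(String[] args) {
            SameDiff sd = SameDiff.create();

            // Placeholder for a minibatch of inputs (unknown batch size, 129 features).
            SDVariable input = sd.placeHolder("input", DataType.FLOAT, -1, 129);

            // Trainable variable, initialized from an INDArray you create manually.
            SDVariable w = sd.var("w", Nd4j.randn(DataType.FLOAT, 128, 64).muli(0.01));

            // Non-trainable, constant-1 weights, like the frozen row in the Keras layer.
            SDVariable w1 = sd.constant("w1", Nd4j.ones(DataType.FLOAT, 1, 64));

            System.out.println(sd.summary());
        }
    }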

@agibsonccc I understood, but I have one question.

My custom layer developed in Python is a traditional dense layer, not one using a lambda function. So I am getting confused about using SameDiff.

Sorry for any trouble, I am new to Java.

@Vamshi you can use SameDiff's dense layer with any initialization you want. You can also specify the weight initialization as well.

You can also manually create dense layers using input * w + b as well.
The only complexity should really be making sure everything matches.
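
Along those lines, here is a sketch of the forward pass from the Python layer written as a plain SameDiff graph (the `numUnits` value, variable names, and random initialization are placeholders; the point is the concat of a frozen block with a trainable block, then input * W + b and a sigmoid):

    import org.nd4j.autodiff.samediff.SDVariable;
    import org.nd4j.autodiff.samediff.SameDiff;
    import org.nd4j.linalg.api.buffer.DataType;
    import org.nd4j.linalg.api.ndarray.INDArray;
    import org.nd4j.linalg.factory.Nd4j;

    import java.util.Collections;

    public class CustomDenseSameDiffSketch {
        public static void main(String[] args) {
            int numUnits = 64;
            SameDiff sd = SameDiff.create();

            // Input: [batchSize, 1 + 128] = [batchSize, 129].
            SDVariable input = sd.placeHolder("input", DataType.FLOAT, -1, 129);

            // Trainable block, like self.weight in the Keras layer.
            SDVariable w = sd.var("other_features_weights", Nd4j.randn(DataType.FLOAT, 128, numUnits).muli(0.01));

            // Frozen all-ones block, like self.weight1 (trainable = False, Constant(1.)).
            SDVariable w1 = sd.constant("training_bid_weights", Nd4j.ones(DataType.FLOAT, 1, numUnits));

            // Bias, like self.bias.
            SDVariable b = sd.var("bias_weights", Nd4j.zeros(DataType.FLOAT, 1, numUnits));

            // final_weight = concat((weight1, weight), axis = 0) -> [129, numUnits].
            SDVariable finalWeight = sd.concat(0, w1, w);

            // y = sigmoid(input . final_weight + bias)
            SDVariable out = sd.nn().sigmoid("out", input.mmul(finalWeight).add(b));

            // Check the output shape/values against the Keras layer for the same fixed input.
            INDArray x = Nd4j.ones(DataType.FLOAT, 2, 129);
            INDArray y = sd.outputSingle(Collections.singletonMap("input", x), "out");
            System.out.println(java.util.Arrays.toString(y.shape())); // [2, 64]
        }
    }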

Start by creating your method in Python and in Java first and just making sure the layer outputs are correct. Then wire it in second.

Okay understood.
I will work on it.
Thank you so much.

@Vamshi sure, thanks for trying! If you need help or find something missing, let me know. For now use that test to see how to set some things up and take it one step at a time. We have examples here as well: deeplearning4j-examples/README.md at 686db99fee3d4825ee70663e1a15aa8d6216f2c2 · deeplearning4j/deeplearning4j-examples · GitHub