How to mask a particular value (-1) in LSTM for representing missing values

Hi, I am working with an LSTM on irregularly spaced data and need to mask the missing values (represented by -1). In Keras this is possible with:
model.add(Masking(mask_value=-1, input_shape=(n_steps, n_features)))
Could someone please let me know how this can be done in DL4J?

In DL4J it is done with explicit masking tensors. It is documented quite extensively here:
https://deeplearning4j.konduit.ai/models/recurrent#masking-one-to-many-many-to-one-and-sequence-classification

As far as I’m aware, we don’t have a direct mask-by-value implementation that would make this easier, but it shouldn’t be too hard to implement yourself as a DataSetPreProcessor.
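To sketch what such a preprocessor would compute: the per-time-step rule below mirrors the semantics of Keras’ `Masking` layer (a step is masked out when every feature equals the mask value). This is just an illustration of the logic using plain Java arrays in place of ND4J `INDArray`s; the class and method names are made up, and a real `DataSetPreProcessor` would read `dataSet.getFeatures()` and call `dataSet.setFeaturesMaskArray(...)` instead.

```java
// Sketch of mask-by-value logic (plain Java arrays stand in for INDArrays).
// A DataSetPreProcessor would apply the same rule to dataSet.getFeatures()
// and install the result via dataSet.setFeaturesMaskArray(...).
public class MaskByValue {

    /**
     * Computes a [miniBatch][timeSteps] mask from features shaped
     * [miniBatch][nFeatures][timeSteps]: a time step is masked out (0.0)
     * when every feature at that step equals maskValue, matching the
     * behaviour of Keras' Masking layer.
     */
    public static double[][] computeMask(double[][][] features, double maskValue) {
        int miniBatch = features.length;
        int timeSteps = features[0][0].length;
        double[][] mask = new double[miniBatch][timeSteps];
        for (int ex = 0; ex < miniBatch; ex++) {
            for (int t = 0; t < timeSteps; t++) {
                boolean allMasked = true;
                for (double[] feature : features[ex]) {
                    if (feature[t] != maskValue) {
                        allMasked = false;
                        break;
                    }
                }
                mask[ex][t] = allMasked ? 0.0 : 1.0;
            }
        }
        return mask;
    }
}
```

For example, with one feature equal to -1 at a step but another still valid, the step is kept; only steps where *all* features are -1 get masked out.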

I’ve successfully used these masking capabilities with an LSTM. Training always takes input, labels, an input mask, and a labels mask. Sometimes the input is masked and sometimes the output is masked (when a particular output can’t be trained on, for example). It can be slightly tedious to remember what each dimension of your arrays represents, but slicing them as you get more specific really helps (e.g. when calculating masks for just one set of data within a minibatch).

Good luck!