What is wrong with the setHasLayerNorm API?

In the examples' LeNetMNIST class, I set setHasLayerNorm to true on the DenseLayer, like this:

DenseLayer.Builder denseBuild = new DenseLayer.Builder().activation(Activation.RELU).nOut(500);
denseBuild.setHasLayerNorm(true);
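For context, the modified layer takes the place of the stock 500-unit dense layer in the example's network configuration. A rough sketch, with the convolution/subsampling layers and other hyperparameters elided as in the stock LeNetMNIST example:

MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
        .seed(123)
        .list()
        // ... stock convolution and subsampling layers from LeNetMNIST ...
        .layer(denseBuild.build())  // the dense layer with layer norm enabled
        .layer(new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                .activation(Activation.SOFTMAX).nOut(10).build())
        .setInputType(InputType.convolutionalFlat(28, 28, 1))
        .build();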
Running the LeNetMNIST class then throws this exception:
o.n.l.c.n.o.NativeOpExecutioner - Failed to execute op layer_norm. Attempted to execute with 2 inputs, 1 outputs, 0 targs, 1 bargs and 1 iargs. Inputs: [(FLOAT,[64,500],c), (FLOAT,[1,500],f)]. Outputs: [(FLOAT,[64,500],c)]. tArgs: -. iArgs: [1]. bArgs: [true]. Op own name: "7a4ba890-7101-4fde-a97f-a646cb5102f4" - Please see above message (printed out from c++) for a possible cause of error.
Exception in thread "main" java.lang.RuntimeException: Op [layer_norm] execution failed
at org.nd4j.linalg.cpu.nativecpu.ops.NativeOpExecutioner.exec(NativeOpExecutioner.java:1594)
at org.deeplearning4j.nn.layers.BaseLayer.preOutputWithPreNorm(BaseLayer.java:323)

I tested the LayerNorm and LayerNormBp ops directly: the gain INDArray needs to have shape [500], not the [1, 500] that the layer passes in.
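For reference, the op can be exercised standalone roughly like this (a minimal sketch using ND4J's generic DynamicCustomOp builder; the iArg and bArg values mirror the ones in the log above, and I'm assuming an addBooleanArguments method is available in this version):

import org.nd4j.linalg.api.buffer.DataType;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.api.ops.DynamicCustomOp;
import org.nd4j.linalg.factory.Nd4j;

INDArray input = Nd4j.rand(DataType.FLOAT, 64, 500);          // activations, shaped as in the log
INDArray gain  = Nd4j.ones(DataType.FLOAT, 500);              // rank-1 gain: executes fine
// INDArray gain = Nd4j.ones(DataType.FLOAT, 1, 500);         // rank-2 gain: reproduces the failure above
INDArray out   = Nd4j.createUninitialized(DataType.FLOAT, 64, 500);

DynamicCustomOp op = DynamicCustomOp.builder("layer_norm")
        .addInputs(input, gain)
        .addOutputs(out)
        .addIntegerArguments(1)       // normalize over dimension 1, matching iArgs: [1]
        .addBooleanArguments(true)    // channelsFirst, matching bArgs: [true] (assumed builder method)
        .build();
Nd4j.getExecutioner().exec(op);

With the rank-1 gain the op runs; with the rank-2 [1, 500] gain it fails with the same message as above.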
What is happening here? Am I using the API incorrectly?