Why can BertInferenceExample only infer 4 sentence pairs at once?

I wonder how to infer more sentence pairs at once.

        // Four sentence pairs to run inference on
        List<Pair<String, String>> sentencePairs = new ArrayList<>();
        sentencePairs.add(new Pair<>("The broader Standard & Poor's 500 Index <.SPX> was 0.46 points lower, or 0.05 percent, at 997.02.", "The technology-laced Nasdaq Composite Index .IXIC was up 7.42 points, or 0.45 percent, at 1,653.44."));
        sentencePairs.add(new Pair<>("Shares in BA were down 1.5 percent at 168 pence by 1420 GMT, off a low of 164p, in a slightly stronger overall London market.", "Shares in BA were down three percent at 165-1/4 pence by 0933 GMT, off a low of 164 pence, in a stronger market."));
        sentencePairs.add(new Pair<>("Last year, Comcast signed 1.5 million new digital cable subscribers.", "Comcast has about 21.3 million cable subscribers, many in the largest U.S. cities."));
        sentencePairs.add(new Pair<>("Revenue rose 3.9 percent, to $1.63 billion from $1.57 billion.", "The McLean, Virginia-based company said newspaper revenue increased 5 percent to $1.46 billion."));
        // Adding a fifth pair (a duplicate of the fourth) triggers the error
        sentencePairs.add(new Pair<>("Revenue rose 3.9 percent, to $1.63 billion from $1.57 billion.", "The McLean, Virginia-based company said newspaper revenue increased 5 percent to $1.46 billion."));

If I add a fifth sentence pair to it, as shown above, there will be an error.

The imported BERT model here expects a fixed batch size of 4; that is why it can only infer 4 sentence pairs at once.

Because of this, the BERT iterator is configured to always produce this batch size. When you feed more than 4 sentence pairs to bertIter.featurizeSentencePairs(sentencePairs), it breaks, because it doesn't know how to deal with the fifth pair.
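One workaround, given that the model always wants exactly 4 pairs: split the input into chunks of 4, pad the final chunk by repeating its last pair, and discard the padded outputs afterwards. A minimal sketch, assuming the example's bertIter; the featurizeSentencePairs call is the one the example itself uses, while the chunking helper and padding strategy are illustrative assumptions, and the Pair import path varies by release:

    import java.util.ArrayList;
    import java.util.List;
    import org.deeplearning4j.iterator.BertIterator;
    import org.nd4j.common.primitives.Pair; // org.nd4j.linalg.primitives.Pair in older releases
    import org.nd4j.linalg.api.ndarray.INDArray;

    // Featurize an arbitrary number of sentence pairs with a model that is
    // fixed to batches of 4: chunk the list and pad the last chunk by
    // repeating its final pair so every batch has exactly the expected size.
    static List<Pair<INDArray[], INDArray[]>> featurizeInChunks(
            BertIterator bertIter, List<Pair<String, String>> pairs, int fixedBatchSize) {
        List<Pair<INDArray[], INDArray[]>> batches = new ArrayList<>();
        for (int start = 0; start < pairs.size(); start += fixedBatchSize) {
            int end = Math.min(start + fixedBatchSize, pairs.size());
            List<Pair<String, String>> chunk = new ArrayList<>(pairs.subList(start, end));
            while (chunk.size() < fixedBatchSize) {
                chunk.add(chunk.get(chunk.size() - 1)); // pad by repeating the last pair
            }
            batches.add(bertIter.featurizeSentencePairs(chunk));
            // After running the model on this batch, only the first (end - start)
            // outputs correspond to real inputs; the rest are padding to discard.
        }
        return batches;
    }

The padding trick works because the model's outputs for the repeated pairs are simply thrown away; only the first end - start results of each chunk are kept.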

In any case we should throw a more useful exception here.
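Until that happens, a small guard in user code gives a clearer failure than the one the iterator produces today. A hypothetical sketch (this check is not part of the library):

    import java.util.List;
    import org.nd4j.common.primitives.Pair;

    // Hypothetical guard, not library code: fail fast with a descriptive
    // message when the input does not match the model's fixed batch size.
    static void checkBatchSize(List<Pair<String, String>> pairs, int fixedBatchSize) {
        if (pairs.size() != fixedBatchSize) {
            throw new IllegalArgumentException("The imported model expects exactly "
                    + fixedBatchSize + " sentence pairs per batch, but got "
                    + pairs.size() + "; split the input into batches of "
                    + fixedBatchSize + ".");
        }
    }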

The batch size is fixed at 4. I tried changing the batch size to 5, but it doesn't work. Is there a possibility to make the batch size dynamic, or to manually set it to a larger size, so I can infer more sentence pairs at once?

Unfortunately, that depends on the model: the BERT model that is imported in that example is fixed to a batch size of 4.

Oh, I see. Maybe I could try changing the batch size in TensorFlow and then export the model myself.
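If the model is re-frozen with a dynamic (or larger) batch dimension on its input placeholders, importing it works the same way as in the example. A minimal sketch, assuming a hypothetical re-exported graph file; the static TFGraphMapper.importGraph call matches recent nd4j versions, while older releases used an instance method instead:

    import java.io.File;
    import org.nd4j.autodiff.samediff.SameDiff;
    import org.nd4j.imports.graphmapper.tf.TFGraphMapper;

    // Hypothetical file: a BERT graph re-frozen in TensorFlow with a dynamic
    // batch dimension, so batches larger than 4 become possible.
    File frozenGraph = new File("bert_dynamic_batch.pb");

    // Import the TensorFlow graph into SameDiff, as the example does for the
    // original fixed-batch model, then inspect its variables and placeholders.
    SameDiff sd = TFGraphMapper.importGraph(frozenGraph);
    System.out.println(sd.summary());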