When I run generate.py with no changes (except removing the pdb breakpoint), this is the output I see:
The pre-trained model you are loading is a cased model but you have not set `do_lower_case` to False. We are setting `do_lower_case=False` for you but you may want to check this behavior.
Decoding strategy sequential, argmax at each step
Iteration 0: this is a sentence .
BERT prediction: . is a . .
Iteration 1: . is a sentence .
BERT prediction: . is a . .
Iteration 2: . is a sentence .
BERT prediction: . is a . .
Iteration 3: . is a sentence .
BERT prediction: . is a . .
Iteration 4: . is a . .
BERT prediction: . . a . .
Final: . is a . .
That doesn't seem to be the desired result. Why so many periods? If this is expected, could you give an example of an input and configuration that produces the intended result?
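For reference, here is a minimal standalone sketch of what I understand "sequential, argmax at each step" decoding to mean, based only on the log above. It uses the Hugging Face transformers API rather than the repo's own code, so the model/tokenizer calls and the exact masking behaviour are my assumptions about what generate.py does internally, not a copy of it:

```python
# Sketch of sequential argmax decoding as I read it from the log.
# Assumptions: one pass left-to-right, masking one position per step,
# committing BERT's argmax at that position. Not the repo's actual code.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-cased", do_lower_case=False)
model = BertForMaskedLM.from_pretrained("bert-base-cased")
model.eval()

tokens = ["[CLS]"] + tokenizer.tokenize("this is a sentence .") + ["[SEP]"]

for pos in range(1, len(tokens) - 1):            # skip [CLS] and [SEP]
    print(f"Iteration {pos - 1}: {' '.join(tokens[1:-1])}")
    masked = list(tokens)
    masked[pos] = "[MASK]"                       # mask only the current position
    input_ids = torch.tensor([tokenizer.convert_tokens_to_ids(masked)])
    with torch.no_grad():
        logits = model(input_ids).logits         # [1, seq_len, vocab_size]
    pred_ids = logits[0].argmax(dim=-1)          # greedy prediction at every position
    pred_tokens = tokenizer.convert_ids_to_tokens(pred_ids.tolist())
    print(f"BERT prediction: {' '.join(pred_tokens[1:-1])}")
    tokens[pos] = pred_tokens[pos]               # commit the argmax at the masked slot

print(f"Final: {' '.join(tokens[1:-1])}")
```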