Using any OPT model from the HuggingFace Transformers library (ex: https://huggingface.co/facebook/opt-350m) as a generator currently raises an exception when attempting to generate suggestions:
Error:

```
File ".../python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 235, in forward
    raise ValueError(
ValueError: Attention mask should be of size (1, 1, 0, 52), but is torch.Size([1, 1, 1, 1])
```
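For context, the exception comes from a shape check in OPT's attention layer: the (expanded) attention mask is expected to have shape `(bsz, 1, tgt_len, src_len)`, and anything else raises the `ValueError` above. Here is a simplified, torch-free sketch of that check (the function name and signature are illustrative, not the actual `modeling_opt.py` code), reproducing the failing case from the traceback:

```python
def check_attention_mask_shape(mask_shape, bsz, tgt_len, src_len):
    """Simplified sketch of the shape check in OPT's attention forward pass.

    OPT expects the expanded attention mask to have shape
    (bsz, 1, tgt_len, src_len); any mismatch raises ValueError.
    """
    expected = (bsz, 1, tgt_len, src_len)
    if tuple(mask_shape) != expected:
        raise ValueError(
            f"Attention mask should be of size {expected}, "
            f"but is {tuple(mask_shape)}"
        )

# The failing case from the traceback: expected (1, 1, 0, 52), got (1, 1, 1, 1)
try:
    check_attention_mask_shape((1, 1, 1, 1), bsz=1, tgt_len=0, src_len=52)
except ValueError as e:
    print(e)  # Attention mask should be of size (1, 1, 0, 52), but is (1, 1, 1, 1)
```

The expected `tgt_len` of 0 suggests the mask passed during incremental generation does not line up with the cached sequence length, which is why suggestion generation fails.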
Workaround: we're actively looking into this; in the meantime, we recommend using GPT Neo (https://huggingface.co/docs/transformers/model_doc/gpt_neo) as an alternative:
```python
import adatest
import transformers

# gen_model = "facebook/opt-125m"  # Currently unsupported
gen_model = "EleutherAI/gpt-neo-125M"

gen_pipeline = transformers.pipeline('text-generation', model=gen_model)
generator = adatest.generators.Transformers(gen_pipeline.model, gen_pipeline.tokenizer)

# `tests` (the adatest test tree) and `model` (the scorer) are assumed to be
# defined earlier in your adatest setup
adatest.serve(tests(model, generator, auto_save=True))
```