[BUG] Transformers 5.0.0 is raising an error with rag fiqa local Notebook #162

@humaira-rf

Description

Bug Description

Transformers 5.0.0 raises an error when running the local (non-Colab) RAG FIQA tutorial notebook.

To Reproduce

Steps to reproduce the behavior:

  • Install the latest release with Transformers 5.0.0
  • Navigate to tutorial_notebooks/rag-contexteng/trackio/
  • Run rf-tutorial-rag-fiqa-trackio.ipynb
  • Observe the error during execution

Error logs

Traceback (most recent call last):
  File "/home/ubuntu/rapidfireai/rapidfireai/evals/actors/query_actor.py", line 140, in initialize_for_pipeline
    self.inference_engine = engine_class(**engine_kwargs)
  File "/home/ubuntu/rapidfireai/rapidfireai/evals/actors/inference_engines.py", line 64, in __init__
    self.llm = LLM(**model_config)
  File "/home/ubuntu/.venv/lib/python3.12/site-packages/vllm/entrypoints/llm.py", line 297, in __init__
    self.llm_engine = LLMEngine.from_engine_args(
  File "/home/ubuntu/.venv/lib/python3.12/site-packages/vllm/v1/engine/llm_engine.py", line 177, in from_engine_args
    return cls(vllm_config=vllm_config,
  File "/home/ubuntu/.venv/lib/python3.12/site-packages/vllm/v1/engine/llm_engine.py", line 96, in __init__
    self.tokenizer = init_tokenizer_from_configs(
  File "/home/ubuntu/.venv/lib/python3.12/site-packages/vllm/transformers_utils/tokenizer.py", line 286, in init_tokenizer_from_configs
    return get_tokenizer(
  File "/home/ubuntu/.venv/lib/python3.12/site-packages/vllm/transformers_utils/tokenizer.py", line 256, in get_tokenizer
    tokenizer = get_cached_tokenizer(tokenizer)
  File "/home/ubuntu/.venv/lib/python3.12/site-packages/vllm/transformers_utils/tokenizer.py", line 99, in get_cached_tokenizer
    tokenizer.all_special_tokens_extended)
  File "/home/ubuntu/.venv/lib/python3.12/site-packages/transformers/tokenization_utils_base.py", line 1326, in __getattr__
    raise AttributeError(f"{self.__class__.__name__} has no attribute {key}")
AttributeError: Qwen2Tokenizer has no attribute all_special_tokens_extended. Did you mean: 'num_special_tokens_to_add'?
2026-02-02 18:52:47 | Experiment | ERROR | experiment.py:422 | [exp1-fiqa-rag-trackio_1:Experiment] Error running multi-config experiment
Traceback (most recent call last):
  File "/home/ubuntu/rapidfireai/rapidfireai/experiment.py", line 411, in run_evals
    results = self.controller.run_multi_pipeline_inference(
  File "/home/ubuntu/rapidfireai/rapidfireai/evals/scheduling/controller.py", line 1414, in run_multi_pipeline_inference
    ray.get(
  File "/home/ubuntu/.venv/lib/python3.12/site-packages/ray/_private/auto_init_hook.py", line 22, in auto_init_wrapper
    return fn(*args, **kwargs)
  File "/home/ubuntu/.venv/lib/python3.12/site-packages/ray/_private/client_mode_hook.py", line 104, in wrapper
    return func(*args, **kwargs)
  File "/home/ubuntu/.venv/lib/python3.12/site-packages/ray/_private/worker.py", line 2967, in get
    values, debugger_breakpoint = worker.get_objects(
  File "/home/ubuntu/.venv/lib/python3.12/site-packages/ray/_private/worker.py", line 1015, in get_objects
    raise value.as_instanceof_cause()
ray.exceptions.RayTaskError(RuntimeError): ray::QueryProcessingActor.initialize_for_pipeline() (pid=24938, ip=10.138.15.207, actor_id=9bbd0a5f6cae851e2c38a54701000000, repr=<rapidfireai.evals.actors.query_actor.QueryProcessingActor object at 0x7359cbff59a0>)
  File "/home/ubuntu/rapidfireai/rapidfireai/evals/actors/query_actor.py", line 241, in initialize_for_pipeline
    raise RuntimeError(
RuntimeError: Failed to initialize pipeline: AttributeError: Qwen2Tokenizer has no attribute all_special_tokens_extended
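The traceback suggests the failure comes from vLLM 0.11.0's get_cached_tokenizer reading the all_special_tokens_extended attribute, which tokenizers in Transformers 5.0.0 apparently no longer expose. A minimal sketch of the kind of defensive lookup that would sidestep this (DummyTokenizer is a stand-in for illustration, not the real Qwen2Tokenizer, and the fallback to all_special_tokens is an assumption, not confirmed vLLM behavior):

```python
# Stand-in tokenizer: a Transformers 5.x tokenizer that only exposes
# all_special_tokens, without the removed all_special_tokens_extended.
class DummyTokenizer:
    all_special_tokens = ["<|endoftext|>", "<|im_end|>"]


def special_tokens_compat(tokenizer):
    """Prefer the pre-5.0 attribute; fall back when it is missing."""
    return getattr(
        tokenizer,
        "all_special_tokens_extended",  # present in Transformers 4.x
        tokenizer.all_special_tokens,   # assumed fallback for 5.x tokenizers
    )


print(special_tokens_compat(DummyTokenizer()))  # → ['<|endoftext|>', '<|im_end|>']
```

With the 4.x-style attribute present, getattr returns it unchanged, so the shim is a no-op on older Transformers versions.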

Environment

Transformers: 5.0.0 (issue present) / 4.57.6 (working)
vLLM: 0.11.0
PyTorch: 2.8.0+cu128
Python version: Python 3.12

The code runs correctly with the previous Transformers release, 4.57.6. The Colab notebook also works, since it installs 4.57.6 by default.
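As an interim workaround (my assumption based on the working 4.57.6 environment above, not an official fix), pinning Transformers below 5.0.0 in the local environment should avoid the error:

```shell
# Pin Transformers to the last known-working 4.x release
# until vLLM handles the 5.0.0 tokenizer API change.
pip install "transformers>=4.57.6,<5.0.0"
```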
