transformers errors

#87
by zhujj0579 - opened

transformers 4.44.0 raises:
Traceback (most recent call last):
File "/share/home/aim/aim_zhujj/glmlor_st.py", line 175, in
outputs = model.generate(
^^^^^^^^^^^^^^^
File "/data/aim_nuist/aim_zhujj/.conda/envs/new_env_name/lib/python3.12/site-packages/peft/peft_model.py", line 1838, in generate
outputs = self.base_model.generate(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/data/aim_nuist/aim_zhujj/.conda/envs/new_env_name/lib/python3.12/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/data/aim_nuist/aim_zhujj/.conda/envs/new_env_name/lib/python3.12/site-packages/transformers/generation/utils.py", line 2024, in generate
result = self._sample(
^^^^^^^^^^^^^
File "/data/aim_nuist/aim_zhujj/.conda/envs/new_env_name/lib/python3.12/site-packages/transformers/generation/utils.py", line 3032, in _sample
model_kwargs = self._update_model_kwargs_for_generation(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/share/home/aim/aim_zhujj/.cache/huggingface/modules/transformers_modules/glm-4-9b-chat/modeling_chatglm.py", line 812, in _update_model_kwargs_for_generation
model_kwargs["past_key_values"] = self._extract_past_from_model_output(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: GenerationMixin._extract_past_from_model_output() got an unexpected keyword argument 'standardize_cache_format'
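This happens because the pinned `modeling_chatglm.py` in the model repo still passes `standardize_cache_format`, a keyword that newer transformers releases removed from `GenerationMixin._extract_past_from_model_output`. One possible workaround (a sketch only, not an official fix) is to wrap the method so stale keywords are silently dropped. The snippet below demonstrates the pattern on a stand-in class so it runs without transformers installed; applying it to the real `transformers.generation.utils.GenerationMixin` is an assumption you should verify against your installed version:

```python
import functools

# Stand-in for transformers' GenerationMixin (assumption: the real class in
# transformers.generation.utils can be patched the same way).
class GenerationMixin:
    def _extract_past_from_model_output(self, outputs):
        # Newer signature: no standardize_cache_format parameter anymore.
        return outputs.get("past_key_values")

def drop_stale_kwargs(func, *stale):
    """Wrap func so that removed keyword arguments are silently ignored."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        for name in stale:
            kwargs.pop(name, None)
        return func(*args, **kwargs)
    return wrapper

GenerationMixin._extract_past_from_model_output = drop_stale_kwargs(
    GenerationMixin._extract_past_from_model_output, "standardize_cache_format"
)

# A call in the style of the old modeling_chatglm.py no longer raises TypeError:
out = GenerationMixin()._extract_past_from_model_output(
    {"past_key_values": "kv"}, standardize_cache_format=True
)
```

Pinning transformers to a release from before the signature change should also avoid the error, at the cost of staying on an older library.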

transformers 4.46.0 and transformers 4.49.0 instead raise:
Traceback (most recent call last):
File "/share/home/aim/aim_zhujj/glmlor_st.py", line 168, in
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to(device)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/data/aim_nuist/aim_zhujj/.conda/envs/new_env_name/lib/python3.12/site-packages/transformers/tokenization_utils_base.py", line 3021, in __call__
encodings = self._call_one(text=text, text_pair=text_pair, **all_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/data/aim_nuist/aim_zhujj/.conda/envs/new_env_name/lib/python3.12/site-packages/transformers/tokenization_utils_base.py", line 3131, in _call_one
return self.encode_plus(
^^^^^^^^^^^^^^^^^
File "/data/aim_nuist/aim_zhujj/.conda/envs/new_env_name/lib/python3.12/site-packages/transformers/tokenization_utils_base.py", line 3207, in encode_plus
return self._encode_plus(
^^^^^^^^^^^^^^^^^^
File "/data/aim_nuist/aim_zhujj/.conda/envs/new_env_name/lib/python3.12/site-packages/transformers/tokenization_utils.py", line 804, in _encode_plus
return self.prepare_for_model(
^^^^^^^^^^^^^^^^^^^^^^^
File "/data/aim_nuist/aim_zhujj/.conda/envs/new_env_name/lib/python3.12/site-packages/transformers/tokenization_utils_base.py", line 3703, in prepare_for_model
encoded_inputs = self.pad(
^^^^^^^^^
File "/data/aim_nuist/aim_zhujj/.conda/envs/new_env_name/lib/python3.12/site-packages/transformers/tokenization_utils_base.py", line 3505, in pad
encoded_inputs = self._pad(
^^^^^^^^^^
TypeError: ChatGLM4Tokenizer._pad() got an unexpected keyword argument 'padding_side'
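This is the mirror-image problem: newer transformers releases forward a `padding_side` keyword to `_pad`, but the repo's custom `ChatGLM4Tokenizer._pad` has the older signature without it. A minimal sketch of a shim, again shown on a stand-in class so it is self-contained (the real tokenizer is the one loaded with `trust_remote_code=True`, and `make_pad_forward_compatible` is a hypothetical helper name):

```python
# Stand-in mimicking the repo's ChatGLM4Tokenizer, whose _pad signature
# predates the padding_side keyword (assumption: only this kwarg differs).
class ChatGLM4Tokenizer:
    def _pad(self, encoded_inputs, max_length=None, padding_strategy=None,
             pad_to_multiple_of=None, return_attention_mask=None):
        return encoded_inputs  # old signature: no padding_side parameter

def make_pad_forward_compatible(tokenizer):
    """Wrap tokenizer._pad so the padding_side kwarg from newer
    transformers is accepted and discarded instead of raising TypeError."""
    orig_pad = tokenizer._pad
    def _pad(*args, padding_side=None, **kwargs):
        return orig_pad(*args, **kwargs)
    tokenizer._pad = _pad
    return tokenizer

tok = make_pad_forward_compatible(ChatGLM4Tokenizer())
# A call in the style of tokenization_utils_base.pad() now succeeds:
result = tok._pad({"input_ids": [1, 2, 3]}, padding_side="left")
```

Note that discarding `padding_side` means the tokenizer keeps whatever padding side its own code uses; if you rely on left-padding for generation, check that the custom `_pad` already does that. Alternatively, updating the model's remote code files to the repo's latest revision may resolve both errors at the source.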
