runtime error
Exit code: 1. Reason: Unused kwargs: ['_load_in_4bit', '_load_in_8bit', 'quant_method']. These kwargs are not used in <class 'transformers.utils.quantization_config.BitsAndBytesConfig'>.

g++ (Debian 12.2.0-14) 12.2.0
Copyright (C) 2022 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.  There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

Traceback (most recent call last):
  File "/home/user/app/app.py", line 46, in <module>
    gemma_model = AutoModelForCausalLM.from_pretrained(model_id)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 564, in from_pretrained
    return model_class.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3657, in from_pretrained
    hf_quantizer.validate_environment(
  File "/usr/local/lib/python3.10/site-packages/transformers/quantizers/quantizer_bnb_4bit.py", line 82, in validate_environment
    validate_bnb_backend_availability(raise_exception=True)
  File "/usr/local/lib/python3.10/site-packages/transformers/integrations/bitsandbytes.py", line 557, in validate_bnb_backend_availability
    return _validate_bnb_multi_backend_availability(raise_exception)
  File "/usr/local/lib/python3.10/site-packages/transformers/integrations/bitsandbytes.py", line 498, in _validate_bnb_multi_backend_availability
    available_devices.discard("cpu")  # Only Intel CPU is supported by BNB at the moment
AttributeError: 'frozenset' object has no attribute 'discard'
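The final frame shows the actual crash: the transformers bitsandbytes integration calls `.discard("cpu")` on a collection that is a `frozenset`. `frozenset` is immutable and has no mutating methods, while `set.discard` exists and silently removes an element if present. A minimal sketch of the type error, and of converting to a mutable `set` first (an assumption about how one would work around it, not the exact upstream patch):

```python
# Reproduce the AttributeError from the traceback: frozenset is
# immutable, so it lacks the mutating .discard() method that set has.
available_devices = frozenset({"cuda", "cpu"})

try:
    available_devices.discard("cpu")
except AttributeError as e:
    print(e)  # 'frozenset' object has no attribute 'discard'

# Copying into a mutable set before mutating avoids the error
# (illustrative workaround; the real fix belongs inside transformers,
# e.g. by upgrading to a release where this bug is patched).
mutable_devices = set(available_devices)
mutable_devices.discard("cpu")
print(mutable_devices)  # {'cuda'}
```

Since the failing code lives inside the installed `transformers` package (not in `app.py`), the practical remedy is to pin a `transformers` version where this check is fixed, rather than editing site-packages.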