I'm getting an error message that isn't very helpful: AttributeError: 'NoneType' object has no attribute 'from_pretrained'. Any help is appreciated!
Example repo:
https://huggingface.co/cardiffnlp/twitt ... atest/tree/main
code
from transformers import pipeline, TFPreTrainedModel, AutoTokenizer
import os
dir = "./models/twitter-roberta-base-sentiment-latest/"
print(os.listdir(dir)) # confirm the folder contents
model = TFPreTrainedModel.from_pretrained(dir)
tokenizer = AutoTokenizer.from_pretrained(dir)
analyze = pipeline(task="sentiment-analysis", model=model, tokenizer=tokenizer)
print(analyze("this is good"))
print(analyze("this is bad"))
output
2025-02-21 16:40:05.896448: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2025-02-21 16:40:06.653841: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
WARNING:tensorflow:From C:\Users\xxxxx\.pyenv\pyenv-win\versions\3.12.8\Lib\site-packages\tf_keras\src\losses.py
['config.json', 'gitattributes', 'merges.txt', 'pytorch_model.bin', 'README.md', 'special_tokens_map.json', 'tf_model.h5', 'vocab.json']
Traceback (most recent call last):
File "C:\Users\xxxxx\OneDrive - DuPont\Python Projects\huggingface\sentiment.py", line 8, in
model = TFPreTrainedModel.from_pretrained(dir)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\xxxxx\.pyenv\pyenv-win\versions\3.12.8\Lib\site-packages\transformers\modeling_tf_utils.py", line 2726, in from_pretrained
config, model_kwargs = cls.config_class.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'from_pretrained'
docs
https://huggingface.co/docs/transformer ... n_classes/model#transformers.TFPreTrainedModel
pretrained_model_name_or_path (str, optional) — Can be either:
A string, the model id of a pretrained model hosted inside a model repo on huggingface.co.
*A path to a directory containing model weights saved using save_pretrained(), e.g., ./my_model_directory/.*
A path or url to a PyTorch state_dict save file (e.g., ./pt_model/pytorch_model.bin). In this case, from_pt should be set to True and a configuration object should be provided as config argument. This loading path is slower than converting the PyTorch model in a TensorFlow model using the provided conversion scripts and loading the TensorFlow model afterwards.
None if you are both providing the configuration and state dictionary (resp. with keyword arguments config and state_dict).
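For reference, here is a minimal sketch of the directory-based loading path the docs describe, but going through a concrete Auto class instead of the TFPreTrainedModel base class (which, as far as I can tell, does not define a config_class of its own, and that seems to be what the traceback is pointing at). The choice of TFAutoModelForSequenceClassification is my assumption for this checkpoint, not something taken from the docs excerpt above:
from transformers import TFAutoModelForSequenceClassification, AutoTokenizer, pipeline

model_dir = "./models/twitter-roberta-base-sentiment-latest/"

# Resolve the concrete model class from config.json in the directory and load the
# TensorFlow weights (tf_model.h5); assumes the folder contents listed in the output above.
model = TFAutoModelForSequenceClassification.from_pretrained(model_dir)
tokenizer = AutoTokenizer.from_pretrained(model_dir)

analyze = pipeline(task="sentiment-analysis", model=model, tokenizer=tokenizer)
print(analyze("this is good"))
print(analyze("this is bad"))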
More details here: https://stackoverflow.com/questions/794 ... downloaded