It looks like this is an environment-variable parsing problem ('NoneType' object has no attribute 'split'), but I cannot pinpoint which variable triggers the failure inside the transformers library.
Environment details:
- Python: 3.11 (running in Docker)
- TensorFlow: 2.15.0
- tf-keras: 2.15.1
- transformers: 4.39.3
- sentence-transformers: 2.7.0
- Keras (standalone): uninstalled (I verified that pip list shows only tf-keras)
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 694, in lifespan
async with self.lifespan_context(app) as maybe_state:
File "/usr/local/lib/python3.11/contextlib.py", line 210, in __aenter__
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/app/api_server.py", line 138, in lifespan
app.state.rag = get_rag()
^^^^^^^^^
File "/app/api_server.py", line 120, in get_rag
from modules.rag.graph import RAGPipeline
File "/app/modules/rag/__init__.py", line 3, in <module>
from .reranker import CrossEncoderReranker
File "/app/modules/rag/reranker.py", line 8, in <module>
from sentence_transformers import CrossEncoder
File "/usr/local/lib/python3.11/site-packages/sentence_transformers/__init__.py", line 3, in <module>
from .datasets import SentencesDataset, ParallelSentencesDataset
File "/usr/local/lib/python3.11/site-packages/sentence_transformers/datasets/__init__.py", line 3, in <module>
from .ParallelSentencesDataset import ParallelSentencesDataset
File "/usr/local/lib/python3.11/site-packages/sentence_transformers/datasets/ParallelSentencesDataset.py", line 4, in <module>
from .. import SentenceTransformer
File "/usr/local/lib/python3.11/site-packages/sentence_transformers/SentenceTransformer.py", line 38, in <module>
from .models import Transformer, Pooling, Normalize
File "/usr/local/lib/python3.11/site-packages/sentence_transformers/models/__init__.py", line 1, in <module>
from .Transformer import Transformer
File "/usr/local/lib/python3.11/site-packages/sentence_transformers/models/Transformer.py", line 2, in <module>
from transformers import AutoModel, AutoTokenizer, AutoConfig, T5Config, MT5Config
File "<frozen importlib._bootstrap>", line 1229, in _handle_fromlist
File "/usr/local/lib/python3.11/site-packages/transformers/utils/import_utils.py", line 1463, in __getattr__
value = getattr(module, name)
^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/transformers/utils/import_utils.py", line 1462, in __getattr__
module = self._get_module(self._class_to_module[name])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/transformers/utils/import_utils.py", line 1474, in _get_module
raise RuntimeError(
RuntimeError: Failed to import transformers.models.auto.modeling_auto because of the following error (look up to see its traceback):
'NoneType' object has no attribute 'split'
ERROR: Application startup failed. Exiting.
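For context, the underlying AttributeError can be reproduced in isolation: os.environ.get returns None for an unset variable, and calling .split() on that None fails with exactly the message in the traceback (the variable name below is hypothetical, chosen only for illustration):

```python
import os

# Minimal sketch of the failure pattern: some code assumes an
# environment variable is always set and calls .split() on it.
value = os.environ.get("HYPOTHETICAL_UNSET_VAR")  # None when the variable is absent
try:
    value.split(",")
except AttributeError as exc:
    print(exc)  # 'NoneType' object has no attribute 'split'
```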
What I have tried so far:
I suspected a Keras 2 vs Keras 3 conflict, so I uninstalled keras and kept tf-keras.
I added the following environment variables in docker-compose.yml and also set them explicitly inside my Python script (via os.environ):
os.environ["TRANSFORMERS_NO_TF"] = "1"
os.environ["TF_USE_LEGACY_KERAS"] = "1"
os.environ["CUDA_VISIBLE_DEVICES"] = "-1" # Suspected this was None, so I set it to -1
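One thing worth double-checking: these os.environ assignments only take effect if they run before the first transformers/sentence_transformers import anywhere in the process. A minimal ordering sketch, using the same variables as above:

```python
import os

# Set overrides at the very top of the entry point, before any import
# that pulls in transformers (directly or transitively).
os.environ["TRANSFORMERS_NO_TF"] = "1"
os.environ["TF_USE_LEGACY_KERAS"] = "1"
os.environ["CUDA_VISIBLE_DEVICES"] = "-1"

# Only after the variables are in place:
# from sentence_transformers import CrossEncoder
```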
I verified that pip list shows tensorflow==2.15.0 and tf-keras==2.15.1.
Despite setting CUDA_VISIBLE_DEVICES to "-1", the error persists at the exact same import line. Does anyone know which specific environment variable transformers v4.39.3 parses with .split() that might be returning None in a Docker environment?
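As a diagnostic step, the RuntimeError that transformers' lazy import machinery raises chains the original exception ("The above exception was the direct cause..."), so printing the full traceback at the failing import should reveal the exact file and line that calls .split() on None:

```python
import traceback

# Diagnostic sketch: trigger the lazy import directly and dump the
# chained traceback, which includes the original AttributeError frames.
try:
    from transformers import AutoModel  # noqa: F401
except Exception:
    traceback.print_exc()
```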
More details here: https://stackoverflow.com/questions/798 ... -attribute