I'm trying to run a single unit test in the Apache Airflow repository (airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py) for development purposes.
I followed the recommended setup steps, but the test keeps failing with two different errors (Windows/Linux).
- Windows (long, but truncated):
Refreshed 97 providers with 1937 Python files.
Traceback (most recent call last):
File "C:\Users\Mido Hany\VS code Projects\Habitat\airflow\scripts\ci\pre_commit\update_providers_dependencies.py", line 188, in
check_if_different_provider_used(file)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^
File "C:\Users\Mido Hany\VS code Projects\Habitat\airflow\scripts\ci\pre_commit\update_providers_dependencies.py", line 157, in check_if_different_provider_used
imports = get_imports_from_file(file_path, only_top_level=False)
File "C:\Users\Mido Hany\VS code Projects\Habitat\airflow\scripts\ci\pre_commit\common_precommit_utils.py", line 379, in get_imports_from_file
root = ast.parse(file_path.read_text(), file_path.name)
~~~~~~~~~~~~~~~~~~~^^
File "C:\Users\Mido Hany\AppData\Local\Programs\Python\Python313\Lib\pathlib\_local.py", line 546, in read_text
return PathBase.read_text(self, encoding, errors, newline)
~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Mido Hany\AppData\Local\Programs\Python\Python313\Lib\pathlib\_abc.py", line 633, in read_text
return f.read()
~~~~~~^^
File "C:\Users\Mido Hany\AppData\Local\Programs\Python\Python313\Lib\encodings\cp1252.py", line 23, in decode
return codecs.charmap_decode(input,self.errors,decoding_table)[0]
~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
UnicodeDecodeError: 'charmap' codec can't decode byte 0x81 in position 2148: character maps to <undefined>
Traceback (most recent call last):
File "", line 198, in _run_module_as_main
File "", line 88, in _run_code
File "C:\Users\Mido Hany\VS code Projects\Habitat\airflow\.venv\Scripts\pytest.exe\__main__.py", line 6, in
sys.exit(console_main())
~~~~~~~~~~~~^^
...
File "C:\Users\Mido Hany\VS code Projects\Habitat\airflow\.venv\Lib\site-packages\_pytest\config\__init__.py", line 876, in import_plugin
__import__(importspec)
~~~~~~~~~~^^^^^^^^^^^^
File "", line 1360, in _find_and_load
File "", line 1331, in _find_and_load_unlocked
File "", line 935, in _load_unlocked
File "C:\Users\Mido Hany\VS code Projects\Habitat\airflow\.venv\Lib\site-packages\_pytest\assertion\rewrite.py", line 197, in exec_module
exec(co, module.__dict__)
~~~~^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Mido Hany\VS code Projects\Habitat\airflow\devel-common\src\tests_common\pytest_plugin.py", line 200, in
subprocess.check_call(["uv", "run", UPDATE_PROVIDER_DEPENDENCIES_SCRIPT.as_posix()])
~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Mido Hany\AppData\Local\Programs\Python\Python313\Lib\subprocess.py", line 419, in check_call
raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['uv', 'run', 'C:/Users/Mido Hany/VS code Projects/Habitat/airflow/scripts/ci/pre_commit/update_providers_dependencies.py']' returned non-zero exit status 1.
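As far as I can tell, the Windows failure boils down to Path.read_text() being called without an encoding, so it falls back to the locale codec (cp1252 on my machine), which cannot decode byte 0x81 from what is presumably a UTF-8 source file. A standalone sketch that reproduces the same decode error (the file name and content here are made up by me):

# Reproduce the cp1252 decode failure outside Airflow.
# "Á" (U+00C1) is 0xC3 0x81 in UTF-8, and 0x81 has no mapping in cp1252.
from pathlib import Path

p = Path("sample.py")
p.write_text("# Á\n", encoding="utf-8")

try:
    # What read_text() effectively does on a Windows box with a cp1252 locale
    p.read_text(encoding="cp1252")
except UnicodeDecodeError as e:
    print("reproduced:", e)

print(p.read_text(encoding="utf-8"), end="")  # explicit encoding works fine

If that's the cause, I assume forcing UTF-8 mode (PYTHONUTF8=1, or python -X utf8, per PEP 540) could work around it, but I haven't verified that.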
- I tried a clean Ubuntu VM on Google Cloud and ran into a different error (very long, but truncated):
============================================= test session starts ==============================================
platform linux -- Python 3.11.2, pytest-9.0.1, pluggy-1.6.0 -- /home/midohany910/habitat/airflow/.venv/bin/python3
cachedir: .pytest_cache
rootdir: /home/midohany910/habitat/airflow
configfile: pyproject.toml
plugins: asyncio-1.3.0, anyio-4.12.0, time-machine-3.1.0
asyncio: mode=Mode.STRICT, debug=False, asyncio_default_fixture_loop_scope=function, asyncio_default_test_loop_scope=function
collected 4 items
airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py::test_overflow_caps_to_configured_max__HABITAT ERROR [ 25%]
airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py::test_overflow_caps_to_internal_max__HABITAT ERROR [ 50%]
airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py::test_non_exponential_policy_unchanged__HABITAT ERROR [ 75%]
airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py::test_small_interval_large_attempts_overflow__HABITAT ERROR [100%]
==================================================== ERRORS ====================================================
_______________________ ERROR at setup of test_overflow_caps_to_configured_max__HABITAT ________________________
.venv/lib/python3.11/site-packages/sqlalchemy/engine/base.py:143: in __init__
self._dbapi_connection = engine.raw_connection()
^^^^^^^^^^^^^^^^^^^^^^^
.venv/lib/python3.11/site-packages/sqlalchemy/engine/base.py
return self.pool.connect()
^^^^^^^^^^^^^^^^^^^
.venv/lib/python3.11/site-packages/sqlalchemy/pool/base.py:447: in connect
return _ConnectionFairy._checkout(self)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
...
.venv/lib/python3.11/site-packages/sqlalchemy/engine/default.py:629: in connect
return self.loaded_dbapi.connect(*cargs, **cparams) # type: ignore[no-any-return] # NOQA: E501
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) unable to open database file
E (Background on this error at: https://sqlalche.me/e/20/e3q8)
-------------------------------------------- Captured stdout setup ---------------------------------------------
========================= AIRFLOW ==========================
Home of the user: /home/midohany910
Airflow home /home/midohany910/airflow
Initializing the DB - first time after entering the container.
Initialization can be also forced by adding --with-db-init flag when running tests.
[2025-11-30T20:05:55.919+0000] {db.py:1146} INFO - Dropping Airflow tables that exist
---------------------------------------------- Captured log setup ----------------------------------------------
INFO airflow.utils.db:db.py:1146 Dropping Airflow tables that exist
________________________ ERROR at setup of test_overflow_caps_to_internal_max__HABITAT _________________________
.venv/lib/python3.11/site-packages/sqlalchemy/engine/base.py:143: in __init__
self._dbapi_connection = engine.raw_connection()
^^^^^^^^^^^^^^^^^^^^^^^
.venv/lib/python3.11/site-packages/sqlalchemy/engine/base.py
return self.pool.connect()
^^^^^^^^^^^^^^^^^^^
.venv/lib/python3.11/site-packages/sqlalchemy/pool/base.py:447: in connect
return _ConnectionFairy._checkout(self)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
...
.venv/lib/python3.11/site-packages/sqlalchemy/engine/create.py:661: in connect
return dialect.connect(*cargs, **cparams)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.venv/lib/python3.11/site-packages/sqlalchemy/engine/default.py:629: in connect
return self.loaded_dbapi.connect(*cargs, **cparams) # type: ignore[no-any-return] # NOQA: E501
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) unable to open database file
E (Background on this error at: https://sqlalche.me/e/20/e3q8)
_______________________ ERROR at setup of test_non_exponential_policy_unchanged__HABITAT _______________________
.venv/lib/python3.11/site-packages/sqlalchemy/engine/base.py:143: in __init__
self._dbapi_connection = engine.raw_connection()
^^^^^^^^^^^^^^^^^^^^^^^
...
.venv/lib/python3.11/site-packages/sqlalchemy/engine/create.py:661: in connect
return dialect.connect(*cargs, **cparams)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.venv/lib/python3.11/site-packages/sqlalchemy/engine/default.py:629: in connect
return self.loaded_dbapi.connect(*cargs, **cparams) # type: ignore[no-any-return] # NOQA: E501
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) unable to open database file
E (Background on this error at: https://sqlalche.me/e/20/e3q8)
____________________ ERROR at setup of test_small_interval_large_attempts_overflow__HABITAT ____________________
.venv/lib/python3.11/site-packages/sqlalchemy/engine/base.py:143: in __init__
self._dbapi_connection = engine.raw_connection()
^^^^^^^^^^^^^^^^^^^^^^^
.venv/lib/python3.11/site-packages/sqlalchemy/engine/base.py
return self.pool.connect()
^^^^^^^^^^^^^^^^^^^
...
.venv/lib/python3.11/site-packages/sqlalchemy/engine/default.py:629: in connect
return self.loaded_dbapi.connect(*cargs, **cparams) # type: ignore[no-any-return] # NOQA: E501
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) unable to open database file
E (Background on this error at: https://sqlalche.me/e/20/e3q8)
===================================== Warning summary. Total: 1, Unique: 1 =====================================
other: total 1, unique 1
runtest: total 1, unique 1
Warnings saved into /home/midohany910/habitat/airflow/devel-common/warnings.txt file.
=========================================== short test summary info ============================================
ERROR airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py::test_overflow_caps_to_configured_max__HABITAT - sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) unable to open database file
ERROR airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py::test_overflow_caps_to_internal_max__HABITAT - sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) unable to open database file
ERROR airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py::test_non_exponential_policy_unchanged__HABITAT - sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) unable to open database file
ERROR airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py::test_small_interval_large_attempts_overflow__HABITAT - sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) unable to open database file
========================================= 1 warning, 4 errors in 6.01s =========================================
I'm running these steps from the root of a cloned Apache Airflow repository:
- Clone the repository and cd into it
git clone https://github.com/apache/airflow.git
cd airflow
- Create and activate a venv (using uv)
uv venv .venv
source .venv/bin/activate
# Win: .venv/Scripts/activate
- Install dependencies
uv pip install -e ./airflow-core[devel]
# Explicitly install testing tools that were missing
uv pip install pytest pytest-asyncio pyyaml
- Set environment variables (I tried several variants to fix import errors)
# Final working PYTHONPATH export
export PYTHONPATH="./airflow-core/src:./airflow-core/tests:.:$PYTHONPATH"
# Win: $env:PYTHONPATH = "$(Get-Location)\airflow-core\src;$(Get-Location)\airflow-core\tests;$(Get-Location)\devel-common\src"
# Added the test mode flag (often required for Airflow tests); see the sanity-check sketch after this list
export AIRFLOW__CORE__UNIT_TEST_MODE="true"
- Run the test
pytest -q airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py
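For what it's worth, my understanding is that Airflow reads configuration overrides from environment variables named AIRFLOW__{SECTION}__{KEY}, so the flag above should map to unit_test_mode in the [core] section. A quick sanity check I use to confirm the flag is actually picked up (just a check, not part of the documented setup):

# Verify the env var reaches Airflow's config loader.
import os

os.environ["AIRFLOW__CORE__UNIT_TEST_MODE"] = "True"

from airflow.configuration import conf  # import after setting the env var

print(conf.getboolean("core", "unit_test_mode"))  # expected: True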
The test session fails outright on Windows; on Linux it starts, attempts to load the test module, and then fails with the error above.
What am I doing wrong?
I truncated the errors because of the message length limit; I can provide the full output if needed.
Update
I found out that Airflow needs to write its database into a folder named airflow directly under the user's home directory, so I simply created it with mkdir ~/airflow.
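For reference, that folder matches Airflow's default AIRFLOW_HOME of ~/airflow. A small pre-flight sketch equivalent to the mkdir above (reading AIRFLOW_HOME first is my own addition, in case it's set to something else):

# Make sure the directory AIRFLOW_HOME points at exists before pytest runs;
# sqlite reports "unable to open database file" when it doesn't.
import os
from pathlib import Path

airflow_home = Path(os.environ.get("AIRFLOW_HOME", str(Path.home() / "airflow")))
airflow_home.mkdir(parents=True, exist_ok=True)
print("AIRFLOW_HOME:", airflow_home)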
Now it shows another error:
(.venv) midohany910@habitat-os:~/habitat/airflow$ pytest -q airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py
============================================= test session starts ==============================================
platform linux -- Python 3.11.2, pytest-9.0.1, pluggy-1.6.0 -- /home/midohany910/habitat/airflow/.venv/bin/python3
cachedir: .pytest_cache
rootdir: /home/midohany910/habitat/airflow
configfile: pyproject.toml
plugins: asyncio-1.3.0, anyio-4.12.0, time-machine-3.1.0
asyncio: mode=Mode.STRICT, debug=False, asyncio_default_fixture_loop_scope=function, asyncio_default_test_loop_scope=function
collected 4 items
airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py::test_overflow_caps_to_configured_max__HABITAT ERROR [ 25%]
airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py::test_overflow_caps_to_internal_max__HABITAT ERROR [ 50%]
airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py::test_non_exponential_policy_unchanged__HABITAT ERROR [ 75%]
airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py::test_small_interval_large_attempts_overflow__HABITAT ERROR [100%]
==================================================== ERRORS ====================================================
_______________________ ERROR at setup of test_overflow_caps_to_configured_max__HABITAT ________________________
airflow-core/src/airflow/serialization/serialized_objects.py
serialized_dag["tasks"] = [cls.serialize(task) for _, task in dag.task_dict.items()]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
airflow-core/src/airflow/serialization/serialized_objects.py
serialized_dag["tasks"] = [cls.serialize(task) for _, task in dag.task_dict.items()]
^^^^^^^^^^^^^^^^^^^
...
airflow-core/src/airflow/serialization/serialized_objects.py
raise SerializationError(f"Failed to serialize DAG {dag.dag_id!r}: {e}")
E airflow.exceptions.SerializationError: Failed to serialize DAG 'example_task_mapping_second_order': 'partial_kwargs'
-------------------------------------------- Captured stdout setup ---------------------------------------------
========================= AIRFLOW ==========================
Home of the user: /home/midohany910
Airflow home /home/midohany910/airflow
Initializing the DB - first time after entering the container.
Initialization can be also forced by adding --with-db-init flag when running tests.
[2025-12-01T16:33:08.312+0000] {db.py:1146} INFO - Dropping Airflow tables that exist
[2025-12-01T16:33:08.469+0000] {migration.py:211} INFO - Context impl SQLiteImpl.
[2025-12-01T16:33:08.470+0000] {migration.py:214} INFO - Will assume non-transactional DDL.
[2025-12-01T16:33:08.474+0000] {db.py:1155} INFO - Dropped all Airflow tables
[2025-12-01T16:33:08.485+0000] {migration.py:211} INFO - Context impl SQLiteImpl.
[2025-12-01T16:33:08.486+0000] {migration.py:214} INFO - Will assume non-transactional DDL.
[2025-12-01T16:33:08.487+0000] {db.py:669} INFO - Creating Airflow database tables from the ORM
[2025-12-01T16:33:08.487+0000] {db.py:674} INFO - Creating context
[2025-12-01T16:33:08.488+0000] {db.py:676} INFO - Binding engine
[2025-12-01T16:33:08.489+0000] {db.py:678} INFO - Pool status: Pool size: 5 Connections in pool: 0 Current Overflow: -2 Current Checked out connections: 3
[2025-12-01T16:33:08.489+0000] {db.py:680} INFO - Creating metadata
[2025-12-01T16:33:08.790+0000] {db.py:683} INFO - Getting alembic config
[2025-12-01T16:33:08.793+0000] {db.py:690} INFO - Stamping migration head
[2025-12-01T16:33:08.795+0000] {migration.py:211} INFO - Context impl SQLiteImpl.
[2025-12-01T16:33:08.795+0000] {migration.py:214} INFO - Will assume non-transactional DDL.
[2025-12-01T16:33:08.825+0000] {migration.py:622} INFO - Running stamp_revision -> a169942745c2
[2025-12-01T16:33:08.831+0000] {db.py:693} INFO - Airflow database tables created
[2025-12-01T16:33:09.033+0000] {manager.py:179} INFO - DAG bundles loaded: dags-folder, example_dags
[2025-12-01T16:33:09.038+0000] {manager.py:222} INFO - Added new DAG bundle dags-folder to the database
[2025-12-01T16:33:09.039+0000] {manager.py:222} INFO - Added new DAG bundle example_dags to the database
[2025-12-01T16:33:09.045+0000] {dagbag.py:652} INFO - Filling up the DagBag from /home/midohany910/habitat/airflow/airflow-core/tests/unit/dags
[2025-12-01T16:33:09.154+0000] {test_task_view_type_check.py:52} INFO - class_instance type:
[2025-12-01T16:33:09.259+0000] {plugins_manager.py:306} ERROR - Failed to import plugin /home/midohany910/habitat/airflow/airflow-core/tests/unit/plugins/test_plugin.py
Traceback (most recent call last):
File "/home/midohany910/habitat/airflow/airflow-core/src/airflow/plugins_manager.py", line 299, in load_plugins_from_plugin_directory
loader.exec_module(mod)
File "", line 940, in exec_module
File "", line 241, in _call_with_frames_removed
File "/home/midohany910/habitat/airflow/airflow-core/tests/unit/plugins/test_plugin.py", line 21, in
from flask import Blueprint
ModuleNotFoundError: No module named 'flask'
[2025-12-01T16:33:09.264+0000] {workday.py:41} WARNING - Could not import pandas. Holidays will not be considered.
[2025-12-01T16:33:09.379+0000] {example_kubernetes_executor.py:38} WARNING - The example_kubernetes_executor example DAG requires the kubernetes provider. Please install it with: pip install apache-airflow[cncf.kubernetes]
[2025-12-01T16:33:09.446+0000] {example_local_kubernetes_executor.py:39} WARNING - Could not import DAGs in example_local_kubernetes_executor.py
Traceback (most recent call last):
File "/home/midohany910/habitat/airflow/airflow-core/src/airflow/example_dags/example_local_kubernetes_executor.py", line 37, in
from kubernetes.client import models as k8s
ModuleNotFoundError: No module named 'kubernetes'
[2025-12-01T16:33:09.447+0000] {example_local_kubernetes_executor.py:40} WARNING - Install Kubernetes dependencies with: pip install apache-airflow[cncf.kubernetes]
[2025-12-01T16:33:09.571+0000] {workday.py:41} WARNING - Could not import pandas. Holidays will not be considered.
---------------------------------------------- Captured log setup ----------------------------------------------
INFO airflow.utils.db:db.py:1146 Dropping Airflow tables that exist
INFO alembic.runtime.migration:migration.py:211 Context impl SQLiteImpl.
INFO alembic.runtime.migration:migration.py:214 Will assume non-transactional DDL.
INFO airflow.utils.db:db.py:1155 Dropped all Airflow tables
INFO alembic.runtime.migration:migration.py:211 Context impl SQLiteImpl.
INFO alembic.runtime.migration:migration.py:214 Will assume non-transactional DDL.
INFO airflow.utils.db:db.py:669 Creating Airflow database tables from the ORM
INFO airflow.utils.db:db.py:674 Creating context
INFO airflow.utils.db:db.py:676 Binding engine
INFO airflow.utils.db:db.py:678 Pool status: Pool size: 5 Connections in pool: 0 Current Overflow: -2 Current Checked out connections: 3
INFO airflow.utils.db:db.py:680 Creating metadata
INFO airflow.utils.db:db.py:683 Getting alembic config
INFO airflow.utils.db:db.py:690 Stamping migration head
INFO alembic.runtime.migration:migration.py:211 Context impl SQLiteImpl.
INFO alembic.runtime.migration:migration.py:214 Will assume non-transactional DDL.
INFO alembic.runtime.migration:migration.py:622 Running stamp_revision -> a169942745c2
INFO airflow.utils.db:db.py:693 Airflow database tables created
INFO airflow.dag_processing.bundles.manager.DagBundlesManager:manager.py:179 DAG bundles loaded: dags-folder, example_dags
INFO airflow.dag_processing.bundles.manager.DagBundlesManager:manager.py:222 Added new DAG bundle dags-folder to the database
INFO airflow.dag_processing.bundles.manager.DagBundlesManager:manager.py:222 Added new DAG bundle example_dags to the database
INFO airflow.models.dagbag.DagBag:dagbag.py:652 Filling up the DagBag from /home/midohany910/habitat/airflow/airflow-core/tests/unit/dags
INFO unusual_prefix_c145e7739885a81884f90b68f6b03b5b9138a5f5_test_task_view_type_check:test_task_view_type_check.py:52 class_instance type:
ERROR airflow.plugins_manager:plugins_manager.py:306 Failed to import plugin /home/midohany910/habitat/airflow/airflow-core/tests/unit/plugins/test_plugin.py
Traceback (most recent call last):
File "/home/midohany910/habitat/airflow/airflow-core/src/airflow/plugins_manager.py", line 299, in load_plugins_from_plugin_directory
loader.exec_module(mod)
File "", line 940, in exec_module
File "", line 241, in _call_with_frames_removed
File "/home/midohany910/habitat/airflow/airflow-core/tests/unit/plugins/test_plugin.py", line 21, in
from flask import Blueprint
ModuleNotFoundError: No module named 'flask'
WARNING airflow.example_dags.plugins.workday:workday.py:41 Could not import pandas. Holidays will not be considered.
WARNING unusual_prefix_d24414d92eb79036310ed52f03a9d15c3e418d63_example_kubernetes_executor:example_kubernetes_executor.py:38 The example_kubernetes_executor example DAG requires the kubernetes provider. Please install it with: pip install apache-airflow[cncf.kubernetes]
WARNING unusual_prefix_1ca60aba5cb140f8d81ab94ef647dec284be408c_example_local_kubernetes_executor:example_local_kubernetes_executor.py:39 Could not import DAGs in example_local_kubernetes_executor.py
...
airflow-core/src/airflow/models/dagbag.py:729: in sync_to_db
dags = [
airflow-core/src/airflow/models/dagbag.py:732: in
else LazyDeserializedDAG(data=SerializedDAG.to_dict(dag))
^^^^^^^^^^^^^^^^^^^^^^^^^^
airflow-core/src/airflow/serialization/serialized_objects.py
json_dict = {"__version": cls.SERIALIZER_VERSION, "dag": cls.serialize_dag(var)}
^^^^^^^^^^^^^^^^^^^^^^
airflow-core/src/airflow/serialization/serialized_objects.py
raise SerializationError(f"Failed to serialize DAG {dag.dag_id!r}: {e}")
E airflow.exceptions.SerializationError: Failed to serialize DAG 'example_task_mapping_second_order': 'partial_kwargs'
===================================== Warning summary. Total: 3, Unique: 3 =====================================
other: total 1, unique 1
runtest: total 1, unique 1
tests: total 2, unique 2
runtest: total 2, unique 2
Warnings saved into /home/midohany910/habitat/airflow/devel-common/warnings.txt file.
=========================================== short test summary info ============================================
ERROR airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py::test_overflow_caps_to_configured_max__HABITAT - airflow.exceptions.SerializationError: Failed to serialize DAG 'example_task_mapping_second_order': 'partia...
ERROR airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py::test_overflow_caps_to_internal_max__HABITAT - airflow.exceptions.SerializationError: Failed to serialize DAG 'example_task_mapping_second_order': 'partia...
ERROR airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py::test_non_exponential_policy_unchanged__HABITAT - airflow.exceptions.SerializationError: Failed to serialize DAG 'example_task_mapping_second_order': 'partia...
ERROR airflow-core/tests/unit/models/test_safe_exponential_backoff__HABITAT.py::test_small_interval_large_attempts_overflow__HABITAT - airflow.exceptions.SerializationError: Failed to serialize DAG 'example_task_mapping_second_order': 'partia...
========================================= 1 warning, 4 errors in 5.89s =========================================
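In case it helps with diagnosis: the trace points at DB init serializing the bundled example DAGs, not my test file. A minimal sketch that I believe would hit the same code path in isolation (the DagBag arguments are my assumption, based only on the traceback above):

# Load only the bundled example DAGs and serialize the failing one the same
# way DagBag.sync_to_db does, via SerializedDAG.to_dict.
from airflow.models.dagbag import DagBag
from airflow.serialization.serialized_objects import SerializedDAG

bag = DagBag(include_examples=True, read_dags_from_db=False)
dag = bag.get_dag("example_task_mapping_second_order")
# Expected to raise SerializationError: ... 'partial_kwargs'
print(SerializedDAG.to_dict(dag))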