astro-cli
Deployment failure due to pytest failure for dags using imports like DagRun
During deployment, astrocloud-cli runs .astrocloud/test_dag_integrity_default.py, which in turn checks for import errors. But DAGs using imports such as:
from airflow.models import XCom, DagRun
fail with the following error:
a7b435-test-1 | =================================== FAILURES ===================================
a7b435-test-1 | ____________________ test_file_imports[dags/dag_catalog.py] ____________________
a7b435-test-1 |
a7b435-test-1 | rel_path = 'dags/dag_catalog.py'
a7b435-test-1 | rv = 'Traceback (most recent call last):\n File "/usr/local/lib/python3.9/site-packages/sqlalchemy/engine/base.py", line 1..._run.dag_id = ?]\n[parameters: (\'dag_fetch_products\',)]\n(Background on this error at: http://sqlalche.me/e/14/e3q8)'
a7b435-test-1 |
a7b435-test-1 | @pytest.mark.parametrize("rel_path,rv", get_import_errors(), ids=[x[0] for x in get_import_errors()])
a7b435-test-1 | def test_file_imports(rel_path,rv):
a7b435-test-1 | """ Test for import errors on a file """
a7b435-test-1 | if rel_path and rv : #Make sure our no op test doesn't raise an error
a7b435-test-1 | > raise Exception(f"{rel_path} failed to import with message \n {rv}")
a7b435-test-1 | E Exception: dags/dag_catalog.py failed to import with message
a7b435-test-1 | E Traceback (most recent call last):
a7b435-test-1 | E File "/usr/local/lib/python3.9/site-packages/sqlalchemy/engine/base.py", line 1705, in _execute_context
a7b435-test-1 | E self.dialect.do_execute(
a7b435-test-1 | E File "/usr/local/lib/python3.9/site-packages/sqlalchemy/engine/default.py", line 716, in do_execute
a7b435-test-1 | E cursor.execute(statement, parameters)
a7b435-test-1 | E sqlite3.OperationalError: no such table: dag_run
a7b435-test-1 | E
a7b435-test-1 | E The above exception was the direct cause of the following exception:
a7b435-test-1 | E
a7b435-test-1 | E Traceback (most recent call last):
a7b435-test-1 | E File "/usr/local/lib/python3.9/site-packages/sqlalchemy/engine/base.py", line 1705, in _execute_context
a7b435-test-1 | E self.dialect.do_execute(
a7b435-test-1 | E File "/usr/local/lib/python3.9/site-packages/sqlalchemy/engine/default.py", line 716, in do_execute
a7b435-test-1 | E cursor.execute(statement, parameters)
a7b435-test-1 | E sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) no such table: dag_run
a7b435-test-1 | E [SQL: SELECT max(dag_run.execution_date) AS max_1
a7b435-test-1 | E FROM dag_run
a7b435-test-1 | E WHERE dag_run.dag_id = ?]
a7b435-test-1 | E [parameters: ('dag_fetch_products',)]
a7b435-test-1 | E (Background on this error at: http://sqlalche.me/e/14/e3q8)
a7b435-test-1 |
a7b435-test-1 | .astrocloud/test_dag_integrity_default.py:89: Exception
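For context, the failure mode can be reproduced outside Airflow with just the stdlib: the integrity test parses DAG files against a fresh sqlite database that has none of the Airflow metadata tables, so any query the DAG file issues at parse time (like the `SELECT max(dag_run.execution_date)` above) blows up. A minimal standalone reproduction (illustrative only, not astro-cli code; the table/column names are copied from the traceback):

```python
import sqlite3

# Fresh in-memory database: stands in for the empty sqlite DB the
# integrity test runs against (no Airflow metadata tables exist yet).
conn = sqlite3.connect(":memory:")
try:
    # Equivalent of the query Airflow issues when a DAG file computes
    # something like a latest execution date at module import time.
    conn.execute(
        "SELECT max(execution_date) FROM dag_run WHERE dag_id = ?",
        ("dag_fetch_products",),
    )
except sqlite3.OperationalError as e:
    print(e)  # -> no such table: dag_run
```

This is why the import itself (`from airflow.models import DagRun`) is not the problem; it is the module-level database access that fails.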
Can we add a pytest fixture for database setup, even if it is just sqlite3?
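A minimal sketch of what such a fixture could look like. This is an assumption, not existing astro-cli code: a real fixture would presumably run `airflow db init` (or `airflow.utils.db.initdb()`) to build the full metadata schema, but for illustration this creates only the one table from the failing query. The helper name, fixture name, and schema are all hypothetical:

```python
import sqlite3

import pytest  # the integrity test already depends on pytest


def create_minimal_airflow_schema(conn):
    # Hypothetical helper: create just enough schema for the
    # parse-time query in the traceback to succeed. A real setup
    # would initialize the full Airflow metadata database instead.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS dag_run ("
        "id INTEGER PRIMARY KEY, dag_id TEXT, execution_date TIMESTAMP)"
    )
    conn.commit()


@pytest.fixture(scope="session")
def airflow_db(tmp_path_factory):
    # Throwaway sqlite file; tests would point
    # AIRFLOW__DATABASE__SQL_ALCHEMY_CONN (name depends on the
    # Airflow version) at this path before importing DAGs.
    db_path = tmp_path_factory.mktemp("db") / "airflow.db"
    conn = sqlite3.connect(db_path)
    create_minimal_airflow_schema(conn)
    conn.close()
    return str(db_path)
```

With the table present, the parse-time `SELECT max(dag_run.execution_date)` returns NULL instead of raising `OperationalError`.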
For now we are suggesting the customer do force deployments at this stage.
original ticket - https://github.com/astronomer/cloud-cli/issues/287