airflow-client-python

get_dag_details api is broken


Client version: 2.1.0

Code:

    from pprint import pprint

    import airflow_client.client as client
    from airflow_client.client.api import dag_api

    # api_client is assumed to be an already-configured, authenticated
    # airflow_client.client.ApiClient; its setup was not shown in the report.
    api_instance = dag_api.DAGApi(api_client)
    dag_id = "dag_id_example"
    try:
        # Get a simplified representation of DAG
        api_response = api_instance.get_dag_details(dag_id)
        pprint(api_response)
    except client.ApiException as e:
        print("Exception when calling DAGApi->get_dag_details: %s\n" % e)

Error:

ApiValueError: Invalid inputs given to generate an instance of 'DAGDetailAllOf'. The input data was invalid for the allOf schema 'DAGDetailAllOf' in the composed schema 'DAGDetail'. Error=Invalid type for variable 'dag_run_timeout'. Required value type is TimeDelta and passed type was NoneType at ['received_data']['dag_run_timeout']

er1shivam · Aug 03 '21
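
Not part of the original report, but one way to sidestep this class of deserialization error is to ask the generated client to skip return-type validation. The sketch below reuses `api_instance`, `dag_id`, and `pprint` from the snippet above and assumes the client exposes the standard openapi-generator `_check_return_type` keyword argument:

    # Hypothetical workaround sketch: skip client-side return-type validation
    # so a null dag_run_timeout in the response does not raise ApiValueError.
    # Assumes the generated client supports the standard openapi-generator
    # `_check_return_type` keyword; api_instance, dag_id and pprint come from
    # the snippet above.
    api_response = api_instance.get_dag_details(
        dag_id,
        _check_return_type=False,
    )
    pprint(api_response)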

Looking at the error message, it seems the problem is in your DAG: dagrun_timeout should be a timedelta. It looks like it is misconfigured. Can you share the DAG?

ephraimbuddy · Aug 03 '21
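
For reference, this is roughly what an explicitly configured dagrun_timeout looks like; a minimal sketch with placeholder dag_id, dates, and timeout value, not taken from the reporter's DAG:

    from datetime import datetime, timedelta

    from airflow import DAG

    # Minimal sketch: dagrun_timeout is passed as a timedelta, which is the
    # type the client expects to see in dag_run_timeout. The dag_id, dates
    # and timeout value are placeholders.
    with DAG(
        dag_id="example_with_timeout",
        start_date=datetime(2021, 1, 1),
        schedule_interval=timedelta(days=1),
        dagrun_timeout=timedelta(hours=1),
    ) as dag:
        pass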

I will share it in a few hours when I log back in. But as you said regarding dag_run_timeout, I haven't defined this in my DAG since it is not a REQUIRED parameter, so I believe the Python API client should be able to handle NoneType when it is not defined in the DAG. I also want to point out that the REST API successfully retrieves the details of the same DAG.

er1shivam · Aug 04 '21
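
For comparison, the stable REST API endpoint behind get_dag_details can be called directly; in that case dag_run_timeout simply comes back as null. A rough sketch, with placeholder host, credentials, and dag_id:

    import requests

    # Illustrative only: query GET /api/v1/dags/{dag_id}/details directly.
    # Host, credentials and dag_id are placeholders; basic auth is assumed
    # to be enabled on the webserver.
    resp = requests.get(
        "http://localhost:8080/api/v1/dags/tutorial/details",
        auth=("admin", "admin"),
    )
    resp.raise_for_status()
    # When the DAG does not set dagrun_timeout, the raw payload contains
    # "dag_run_timeout": null, which is what the client fails to deserialize.
    print(resp.json().get("dag_run_timeout"))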

I am using this DAG from the example on the official Airflow website. Please visit this link...


from datetime import timedelta
from textwrap import dedent

# The DAG object; we'll need this to instantiate a DAG
from airflow import DAG

# Operators; we need this to operate!
from airflow.operators.bash import BashOperator
from airflow.utils.dates import days_ago
# These args will get passed on to each operator
# You can override them on a per-task basis during operator initialization
default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'email': ['airflow@example.com'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
    # 'queue': 'bash_queue',
    # 'pool': 'backfill',
    # 'priority_weight': 10,
    # 'end_date': datetime(2016, 1, 1),
    # 'wait_for_downstream': False,
    # 'dag': dag,
    # 'sla': timedelta(hours=2),
    # 'execution_timeout': timedelta(seconds=300),
    # 'on_failure_callback': some_function,
    # 'on_success_callback': some_other_function,
    # 'on_retry_callback': another_function,
    # 'sla_miss_callback': yet_another_function,
    # 'trigger_rule': 'all_success'
}
with DAG(
    'tutorial',
    default_args=default_args,
    description='A simple tutorial DAG',
    schedule_interval=timedelta(days=1),
    start_date=days_ago(2),
    tags=['example'],
) as dag:

    # t1, t2 and t3 are examples of tasks created by instantiating operators
    t1 = BashOperator(
        task_id='print_date',
        bash_command='date',
    )

    t2 = BashOperator(
        task_id='sleep',
        depends_on_past=False,
        bash_command='sleep 5',
        retries=3,
    )
    t1.doc_md = dedent(
        """\
    #### Task Documentation
    You can document your task using the attributes `doc_md` (markdown),
    `doc` (plain text), `doc_rst`, `doc_json`, `doc_yaml` which gets
    rendered in the UI's Task Instance Details page.
    ![img](http://montcs.bloomu.edu/~bobmon/Semesters/2012-01/491/import%20soul.png)

    """
    )

    dag.doc_md = __doc__  # providing that you have a docstring at the beginning of the DAG
    dag.doc_md = """
    This is a documentation placed anywhere
    """  # otherwise, type it like this
    templated_command = dedent(
        """
    {% for i in range(5) %}
        echo "{{ ds }}"
        echo "{{ macros.ds_add(ds, 7)}}"
        echo "{{ params.my_param }}"
    {% endfor %}
    """
    )

    t3 = BashOperator(
        task_id='templated',
        depends_on_past=False,
        bash_command=templated_command,
        params={'my_param': 'Parameter I passed in'},
    )

    t1 >> [t2, t3]


er1shivam · Aug 04 '21

Closing, solved in 2.7.0. The problem originated from wrong usage of the nullable property next to a $ref object.

Feel free to re-open if needed.

A release candidate is available here for testing purposes: https://pypi.org/project/apache-airflow-client/2.7.0rc1/

The stable 2.7.0 client will be released shortly.

pierrejeambrun · Aug 24 '23
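
For anyone landing here later, a quick way to confirm the fix is to re-run the original call against the 2.7.0 client; per the comment above, a DAG without dagrun_timeout should now deserialize. A hypothetical sketch reusing `api_instance` from the first snippet:

    # Hypothetical re-check with apache-airflow-client 2.7.0 (or the RC):
    # a DAG that does not set dagrun_timeout should now deserialize, with
    # dag_run_timeout returned as None instead of raising ApiValueError.
    # The dag_id "tutorial" is the example DAG shared in this thread.
    details = api_instance.get_dag_details("tutorial")
    print(details.dag_run_timeout)  # expected: None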