on_failure_callback not triggered when DatabricksRunNowOperator fails #54023

@maroskooh

Description

Apache Airflow Provider(s)

databricks

Versions of Apache Airflow Providers

apache-airflow-providers-databricks==7.6.0

Apache Airflow version

3.0.3

Operating System

Debian GNU/Linux 12

Deployment

Docker-Compose

Deployment details

apache-airflow-providers-databricks==7.6.0

What happened

When using the DatabricksRunNowOperator, if the provided job_id or job_name does not exist in Databricks, the operator fails (Databricks returns a 400 error and an AirflowException is raised) and the task is correctly marked as failed in the UI.

However, the task’s on_failure_callback is not triggered, even though the task failed during normal operator execution.

The same happens if I provide a wrong connection name.

What you think should happen instead

The on_failure_callback should be triggered on any task failure that occurs during execution, including remote API failures from Databricks (such as an invalid job ID). This is consistent with the documented behavior of callbacks in Airflow.

How to reproduce

Airflow version: 3.0.3, Databricks provider version: apache-airflow-providers-databricks==7.6.0

1. Create a DAG with a DatabricksRunNowOperator task.
2. Set the job_name parameter to a non-existent job in your Databricks workspace, or use a non-existent connection.
3. Define a valid on_failure_callback function and pass it directly to the operator via the on_failure_callback argument.
4. Trigger the DAG manually or via schedule.

Anything else

Received logs:
[2025-07-31, 20:10:09] INFO - Connection Retrieved 'DBX_DEV': source="airflow.hooks.base"
[2025-07-31, 20:10:09] INFO - Using basic auth.: source="airflow.task.hooks.airflow.providers.databricks.hooks.databricks.DatabricksHook"
[2025-07-31, 20:10:09] ERROR - Task failed with exception: source="task"
AirflowException: Job ID for job name DEV_logging_ can not be found
File "/usr/local/lib/python3.12/site-packages/airflow/sdk/execution_time/task_runner.py", line 877 in run
File "/usr/local/lib/python3.12/site-packages/airflow/sdk/execution_time/task_runner.py", line 1164 in _execute_task
File "/usr/local/lib/python3.12/site-packages/airflow/sdk/bases/operator.py", line 397 in wrapper
File "/usr/local/lib/python3.12/site-packages/airflow/providers/databricks/operators/databricks.py", line 936 in execute
[2025-07-31, 20:10:09] WARNING - No XCom value found; defaulting to None.: key="run_page_url": dag_id="dbx_test": task_id="run_now": run_id="scheduled__2025-07-31T18:10:00+00:00": map_index=-1: source="task"
[2025-07-31, 20:10:09] ERROR - Top level error: source="task"
AirflowRuntimeError: API_SERVER_ERROR: {'status_code': 422, 'message': 'Remote server returned validation error', 'detail': {'detail': [{'type': 'missing', 'loc': ['body'], 'msg': 'Field required', 'input': None}]}}
File "/usr/local/lib/python3.12/site-packages/airflow/sdk/execution_time/task_runner.py", line 1302 in main
File "/usr/local/lib/python3.12/site-packages/airflow/sdk/execution_time/task_runner.py", line 1242 in finalize
File "/usr/local/lib/python3.12/site-packages/airflow/sdk/execution_time/task_runner.py", line 546 in _xcom_push_to_db
File "/usr/local/lib/python3.12/site-packages/airflow/sdk/bases/xcom.py", line 116 in _set_xcom_in_db
File "/usr/local/lib/python3.12/site-packages/airflow/sdk/execution_time/comms.py", line 203 in send
File "/usr/local/lib/python3.12/site-packages/airflow/sdk/execution_time/comms.py", line 263 in _get_response
File "/usr/local/lib/python3.12/site-packages/airflow/sdk/execution_time/comms.py", line 250 in _from_frame
[2025-07-31, 20:10:09] WARNING - Process exited abnormally: exit_code=1: source="task"

Are you willing to submit PR?

  • Yes I am willing to submit a PR!
