apache/airflow-client-python

Apache Airflow Python Client

Overview

To facilitate management, Apache Airflow supports a range of REST API endpoints across its objects. This section provides an overview of the API design, methods, and supported use cases.

Most of the endpoints accept JSON as input and return JSON responses. This means that you must usually add the following headers to your request:

Content-type: application/json
Accept: application/json
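
Since nearly every call carries these same two headers, it is convenient to set them once. A minimal sketch using the requests library (the same library used in the update_mask example below); no request is actually sent here:

```python
import requests

# A session pre-configured with the JSON headers most endpoints expect.
# Point it at your own Airflow API server when making real calls.
session = requests.Session()
session.headers.update({
    "Content-Type": "application/json",
    "Accept": "application/json",
})
# Example call (commented out -- requires a running Airflow instance):
# session.get("http://localhost:8080/api/v2/version")
```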

Resources

The term resource refers to a single type of object in the Airflow metadata. An API is broken up by its endpoint's corresponding resource. The name of a resource is typically plural and expressed in camelCase. Example: dagRuns.

Resource names are used as part of endpoint URLs, as well as in API parameters and responses.

CRUD Operations

The platform supports Create, Read, Update, and Delete operations on most resources. You can review the standards for these operations and their standard parameters below.

Some endpoints have special behavior as exceptions.

Create

To create a resource, you typically submit an HTTP POST request with the resource's required metadata in the request body. The response returns a 201 Created response code upon success, with the resource's metadata, including its internal id, in the response body.

Read

The HTTP GET request can be used to read a resource or to list a number of resources.

A resource's id can be submitted in the request parameters to read a specific resource. The response usually returns a 200 OK response code upon success, with the resource's metadata in the response body.

If a GET request does not include a specific resource id, it is treated as a list request. The response usually returns a 200 OK response code upon success, with an object containing a list of resources' metadata in the response body.

When reading resources, some common query parameters are usually available, for example:

/api/v2/connections?limit=25&offset=25
| Query Parameter | Type | Description |
|---|---|---|
| limit | integer | Maximum number of objects to fetch. Usually 25 by default. |
| offset | integer | Offset after which to start returning objects. For use with the limit query parameter. |
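
With these conventions, a client can page through a full listing by advancing offset by limit until a short page comes back. A minimal sketch, where fetch_page stands in for the real HTTP call:

```python
def fetch_all(fetch_page, limit=25):
    """Collect every item from a paged list endpoint.

    fetch_page(limit, offset) is a stand-in for the real HTTP call, e.g.
    GET /api/v2/connections?limit=25&offset=50, returning one page of items.
    """
    items, offset = [], 0
    while True:
        page = fetch_page(limit=limit, offset=offset)
        items.extend(page)
        if len(page) < limit:  # a short (or empty) page means we are done
            return items
        offset += limit
```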

Update

Updating a resource requires the resource id, and is typically done using an HTTP PATCH request, with the fields to modify in the request body. The response usually returns a 200 OK response code upon success, with information about the modified resource in the response body.

Delete

Deleting a resource requires the resource id and is typically executed via an HTTP DELETE request. The response usually returns a 204 No Content response code upon success.
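
The CRUD conventions above can be summarized in a small sketch that maps each operation onto the HTTP method and URL the API expects (the base URL and the pool resource are illustrative placeholders):

```python
BASE_URL = "http://localhost:8080/api/v2"  # placeholder Airflow API server

def crud_request(operation, resource, resource_id=None):
    """Map a CRUD operation onto the (HTTP method, URL) pair the API expects."""
    methods = {"create": "POST", "read": "GET", "update": "PATCH", "delete": "DELETE"}
    url = f"{BASE_URL}/{resource}"
    if resource_id is not None:  # create and list requests carry no id
        url = f"{url}/{resource_id}"
    return methods[operation], url
```

For example, crud_request("update", "pools", "my_pool") yields ("PATCH", "http://localhost:8080/api/v2/pools/my_pool").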

Conventions

  • Resource names are plural and expressed in camelCase.

  • Names are consistent between URL parameter name and field name.

  • Field names are in snake_case.

{
    "name": "string",
    "slots": 0,
    "occupied_slots": 0,
    "used_slots": 0,
    "queued_slots": 0,
    "open_slots": 0
}

Update Mask

Update mask is available as a query parameter in patch endpoints. It is used to notify the API which fields you want to update. Using update_mask makes it easier to update objects by helping the server know which fields to update in an object instead of updating all fields. The update request ignores any fields that aren't specified in the field mask, leaving them with their current values.

Example:

import json

import requests

# The URLs below are illustrative; prefix them with your Airflow API base URL.
resource = requests.get("/resource/my-id").json()
resource["my_field"] = "new-value"
requests.patch("/resource/my-id?update_mask=my_field", data=json.dumps(resource))

Versioning and Endpoint Lifecycle

  • API versioning is not synchronized to specific releases of Apache Airflow.
  • APIs are designed to be backward compatible.
  • Any changes to the API will first go through a deprecation phase.

Trying the API

You can use a third party client, such as curl, HTTPie, Postman or the Insomnia rest client to test the Apache Airflow API.

Note that you will need to pass authentication credentials. For example, here is how to pause a DAG with curl, using a Bearer token:

curl -X PATCH 'https://example.com/api/v2/dags/{dag_id}?update_mask=is_paused' \
  -H 'Content-Type: application/json' \
  -H 'Authorization: Bearer YOUR_ACCESS_TOKEN' \
  -d '{
      "is_paused": true
  }'

Using a graphical tool such as Postman or Insomnia, it is possible to import the API specifications directly:

  1. Download the API specification by clicking the Download button at the top of this document.
  2. Import the JSON specification in the graphical tool of your choice.
  • In Postman, you can click the import button at the top
  • With Insomnia, you can just drag-and-drop the file on the UI

Note that with Postman, you can also generate code snippets by selecting a request and clicking on the Code button.

Enabling CORS

Cross-origin resource sharing (CORS) is a browser security feature that restricts HTTP requests that are initiated from scripts running in the browser.

For details on enabling/configuring CORS, see Enabling CORS.

Authentication

To meet the requirements of many organizations, Airflow supports multiple authentication methods, and it is even possible to add your own.

If you want to check which auth backend is currently set, you can use the airflow config get-value api auth_backends command, as in the example below.

$ airflow config get-value api auth_backends
airflow.providers.fab.auth_manager.api.auth.backend.basic_auth

The default is to deny all requests.

For details on configuring the authentication, see API Authorization.
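
With the basic_auth backend shown above, a client authenticates by sending an HTTP Basic Authorization header. A minimal sketch of how that header is constructed (the credentials are placeholders; in practice requests builds this for you via requests.get(url, auth=(user, password))):

```python
import base64

def basic_auth_header(user, password):
    """Build the Authorization header value for HTTP Basic authentication."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return f"Basic {token}"
```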

Errors

We follow the error response format proposed in RFC 7807, also known as Problem Details for HTTP APIs. As with our normal API responses, your client must be prepared to gracefully handle additional members of the response.

Unauthenticated

This indicates that the request has not been applied because it lacks valid authentication credentials for the target resource. Please check that you have valid credentials.

PermissionDenied

This response means that the server understood the request but refuses to authorize it because the client lacks sufficient rights to the resource. It happens when you do not have the necessary permission to execute the action you performed. You need to obtain the appropriate permissions in order to resolve this error.

BadRequest

This response means that the server cannot or will not process the request due to something that is perceived to be a client error (e.g., malformed request syntax, invalid request message framing, or deceptive request routing). To resolve this, please ensure that your syntax is correct.

NotFound

This client error response indicates that the server cannot find the requested resource.

MethodNotAllowed

Indicates that the request method is known by the server but is not supported by the target resource.

NotAcceptable

The target resource does not have a current representation that would be acceptable to the user agent, according to the proactive negotiation header fields received in the request, and the server is unwilling to supply a default representation.

AlreadyExists

The request could not be completed due to a conflict with the current state of the target resource, e.g. the resource it tries to create already exists.

Unknown

This means that the server encountered an unexpected condition that prevented it from fulfilling the request.
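
Because every error above arrives as an RFC 7807 problem body, a client can handle them uniformly by parsing the standard members and tolerating any extras. A sketch with an illustrative, made-up error body:

```python
import json

# An illustrative RFC 7807 "problem detail" body, such as a NotFound error;
# the field values here are made up for this sketch.
body = '{"type": "about:blank", "title": "NotFound", "status": 404, "detail": "DAG not found"}'

def describe_problem(raw):
    """Summarize an RFC 7807 error body, tolerating additional members."""
    problem = json.loads(raw)
    return f"{problem.get('status')} {problem.get('title')}: {problem.get('detail')}"
```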

This Python package is automatically generated by the OpenAPI Generator project:

  • API version: 2.9.0
  • Package version: 2.9.0
  • Build package: org.openapitools.codegen.languages.PythonClientCodegen

For more information, please visit https://airflow.apache.org

Requirements

Python >=3.9

Installation & Usage

pip install

You can install the client using standard Python installation tools. It is hosted on PyPI under the apache-airflow-client package name, so the easiest way to get the latest version is to run:

pip install apache-airflow-client

You can also install the package directly from its GitHub repository:

pip install git+https://github.com/apache/airflow-client-python.git

Import check

Then import the package:

import airflow_client.client

Getting Started

Please follow the installation procedure and then run the following:

import os

import airflow_client.client
from airflow_client.client.rest import ApiException
from pprint import pprint

# Defining the host is optional and defaults to http://localhost
# See configuration.py for a list of all supported configuration parameters.
configuration = airflow_client.client.Configuration(host="http://localhost")

# The client must configure the authentication and authorization parameters
# in accordance with the API server security policy.
# Examples for each auth method are provided below, use the example that
# satisfies your auth use case.

configuration.access_token = os.environ["ACCESS_TOKEN"]


# Enter a context with an instance of the API client
with airflow_client.client.ApiClient(configuration) as api_client:
    # Create an instance of the API class
    api_instance = airflow_client.client.AssetApi(api_client)
    create_asset_events_body = airflow_client.client.CreateAssetEventsBody()  # CreateAssetEventsBody |

    try:
        # Create Asset Event
        api_response = api_instance.create_asset_event(create_asset_events_body)
        print("The response of AssetApi->create_asset_event:\n")
        pprint(api_response)
    except ApiException as e:
        print("Exception when calling AssetApi->create_asset_event: %s\n" % e)

Documentation for API Endpoints

All URIs are relative to http://localhost

Class Method HTTP request Description
AssetApi create_asset_event POST /api/v2/assets/events Create Asset Event
AssetApi delete_asset_queued_events DELETE /api/v2/assets/{asset_id}/queuedEvents Delete Asset Queued Events
AssetApi delete_dag_asset_queued_event DELETE /api/v2/dags/{dag_id}/assets/{asset_id}/queuedEvents Delete Dag Asset Queued Event
AssetApi delete_dag_asset_queued_events DELETE /api/v2/dags/{dag_id}/assets/queuedEvents Delete Dag Asset Queued Events
AssetApi get_asset GET /api/v2/assets/{asset_id} Get Asset
AssetApi get_asset_alias GET /api/v2/assets/aliases/{asset_alias_id} Get Asset Alias
AssetApi get_asset_aliases GET /api/v2/assets/aliases Get Asset Aliases
AssetApi get_asset_events GET /api/v2/assets/events Get Asset Events
AssetApi get_asset_queued_events GET /api/v2/assets/{asset_id}/queuedEvents Get Asset Queued Events
AssetApi get_assets GET /api/v2/assets Get Assets
AssetApi get_dag_asset_queued_event GET /api/v2/dags/{dag_id}/assets/{asset_id}/queuedEvents Get Dag Asset Queued Event
AssetApi get_dag_asset_queued_events GET /api/v2/dags/{dag_id}/assets/queuedEvents Get Dag Asset Queued Events
AssetApi materialize_asset POST /api/v2/assets/{asset_id}/materialize Materialize Asset
BackfillApi cancel_backfill PUT /api/v2/backfills/{backfill_id}/cancel Cancel Backfill
BackfillApi create_backfill POST /api/v2/backfills Create Backfill
BackfillApi create_backfill_dry_run POST /api/v2/backfills/dry_run Create Backfill Dry Run
BackfillApi get_backfill GET /api/v2/backfills/{backfill_id} Get Backfill
BackfillApi list_backfills GET /api/v2/backfills List Backfills
BackfillApi pause_backfill PUT /api/v2/backfills/{backfill_id}/pause Pause Backfill
BackfillApi unpause_backfill PUT /api/v2/backfills/{backfill_id}/unpause Unpause Backfill
ConfigApi get_config GET /api/v2/config Get Config
ConfigApi get_config_value GET /api/v2/config/section/{section}/option/{option} Get Config Value
ConnectionApi bulk_connections PATCH /api/v2/connections Bulk Connections
ConnectionApi create_default_connections POST /api/v2/connections/defaults Create Default Connections
ConnectionApi delete_connection DELETE /api/v2/connections/{connection_id} Delete Connection
ConnectionApi get_connection GET /api/v2/connections/{connection_id} Get Connection
ConnectionApi get_connections GET /api/v2/connections Get Connections
ConnectionApi patch_connection PATCH /api/v2/connections/{connection_id} Patch Connection
ConnectionApi post_connection POST /api/v2/connections Post Connection
ConnectionApi test_connection POST /api/v2/connections/test Test Connection
DAGApi delete_dag DELETE /api/v2/dags/{dag_id} Delete Dag
DAGApi get_dag GET /api/v2/dags/{dag_id} Get Dag
DAGApi get_dag_details GET /api/v2/dags/{dag_id}/details Get Dag Details
DAGApi get_dag_tags GET /api/v2/dagTags Get Dag Tags
DAGApi get_dags GET /api/v2/dags Get Dags
DAGApi patch_dag PATCH /api/v2/dags/{dag_id} Patch Dag
DAGApi patch_dags PATCH /api/v2/dags Patch Dags
DAGParsingApi reparse_dag_file PUT /api/v2/parseDagFile/{file_token} Reparse Dag File
DagReportApi get_dag_reports GET /api/v2/dagReports Get Dag Reports
DagRunApi clear_dag_run POST /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/clear Clear Dag Run
DagRunApi delete_dag_run DELETE /api/v2/dags/{dag_id}/dagRuns/{dag_run_id} Delete Dag Run
DagRunApi get_dag_run GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id} Get Dag Run
DagRunApi get_dag_runs GET /api/v2/dags/{dag_id}/dagRuns Get Dag Runs
DagRunApi get_list_dag_runs_batch POST /api/v2/dags/{dag_id}/dagRuns/list Get List Dag Runs Batch
DagRunApi get_upstream_asset_events GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/upstreamAssetEvents Get Upstream Asset Events
DagRunApi patch_dag_run PATCH /api/v2/dags/{dag_id}/dagRuns/{dag_run_id} Patch Dag Run
DagRunApi trigger_dag_run POST /api/v2/dags/{dag_id}/dagRuns Trigger Dag Run
DagSourceApi get_dag_source GET /api/v2/dagSources/{dag_id} Get Dag Source
DagStatsApi get_dag_stats GET /api/v2/dagStats Get Dag Stats
DagVersionApi get_dag_version GET /api/v2/dags/{dag_id}/dagVersions/{version_number} Get Dag Version
DagVersionApi get_dag_versions GET /api/v2/dags/{dag_id}/dagVersions Get Dag Versions
DagWarningApi list_dag_warnings GET /api/v2/dagWarnings List Dag Warnings
EventLogApi get_event_log GET /api/v2/eventLogs/{event_log_id} Get Event Log
EventLogApi get_event_logs GET /api/v2/eventLogs Get Event Logs
ExtraLinksApi get_extra_links GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/links Get Extra Links
ImportErrorApi get_import_error GET /api/v2/importErrors/{import_error_id} Get Import Error
ImportErrorApi get_import_errors GET /api/v2/importErrors Get Import Errors
JobApi get_jobs GET /api/v2/jobs Get Jobs
LoginApi login GET /api/v2/auth/login Login
LoginApi logout GET /api/v2/auth/logout Logout
MonitorApi get_health GET /api/v2/monitor/health Get Health
PluginApi get_plugins GET /api/v2/plugins Get Plugins
PoolApi bulk_pools PATCH /api/v2/pools Bulk Pools
PoolApi delete_pool DELETE /api/v2/pools/{pool_name} Delete Pool
PoolApi get_pool GET /api/v2/pools/{pool_name} Get Pool
PoolApi get_pools GET /api/v2/pools Get Pools
PoolApi patch_pool PATCH /api/v2/pools/{pool_name} Patch Pool
PoolApi post_pool POST /api/v2/pools Post Pool
ProviderApi get_providers GET /api/v2/providers Get Providers
TaskApi get_task GET /api/v2/dags/{dag_id}/tasks/{task_id} Get Task
TaskApi get_tasks GET /api/v2/dags/{dag_id}/tasks Get Tasks
TaskInstanceApi get_extra_links GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/links Get Extra Links
TaskInstanceApi get_log GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/logs/{try_number} Get Log
TaskInstanceApi get_mapped_task_instance GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/{map_index} Get Mapped Task Instance
TaskInstanceApi get_mapped_task_instance_tries GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/{map_index}/tries Get Mapped Task Instance Tries
TaskInstanceApi get_mapped_task_instance_try_details GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/{map_index}/tries/{task_try_number} Get Mapped Task Instance Try Details
TaskInstanceApi get_mapped_task_instances GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/listMapped Get Mapped Task Instances
TaskInstanceApi get_task_instance GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id} Get Task Instance
TaskInstanceApi get_task_instance_dependencies GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/dependencies Get Task Instance Dependencies
TaskInstanceApi get_task_instance_dependencies_by_map_index GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/{map_index}/dependencies Get Task Instance Dependencies
TaskInstanceApi get_task_instance_tries GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/tries Get Task Instance Tries
TaskInstanceApi get_task_instance_try_details GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/tries/{task_try_number} Get Task Instance Try Details
TaskInstanceApi get_task_instances GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances Get Task Instances
TaskInstanceApi get_task_instances_batch POST /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/list Get Task Instances Batch
TaskInstanceApi patch_task_instance PATCH /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id} Patch Task Instance
TaskInstanceApi patch_task_instance_by_map_index PATCH /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/{map_index} Patch Task Instance
TaskInstanceApi patch_task_instance_dry_run PATCH /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/dry_run Patch Task Instance Dry Run
TaskInstanceApi patch_task_instance_dry_run_by_map_index PATCH /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/{map_index}/dry_run Patch Task Instance Dry Run
TaskInstanceApi post_clear_task_instances POST /api/v2/dags/{dag_id}/clearTaskInstances Post Clear Task Instances
VariableApi bulk_variables PATCH /api/v2/variables Bulk Variables
VariableApi delete_variable DELETE /api/v2/variables/{variable_key} Delete Variable
VariableApi get_variable GET /api/v2/variables/{variable_key} Get Variable
VariableApi get_variables GET /api/v2/variables Get Variables
VariableApi patch_variable PATCH /api/v2/variables/{variable_key} Patch Variable
VariableApi post_variable POST /api/v2/variables Post Variable
VersionApi get_version GET /api/v2/version Get Version
XComApi create_xcom_entry POST /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries Create Xcom Entry
XComApi get_xcom_entries GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries Get Xcom Entries
XComApi get_xcom_entry GET /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries/{xcom_key} Get Xcom Entry
XComApi update_xcom_entry PATCH /api/v2/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries/{xcom_key} Update Xcom Entry

Documentation For Models

Documentation For Authorization

By default, the generated client supports the following authentication schemes:

  • Basic
  • GoogleOpenID
  • Kerberos
  • OAuth2PasswordBearer

However, you can generate the client and documentation with your own schemes by adding them to the security section of the OpenAPI specification. You can do this with the Breeze CLI by adding the --security-schemes option to the breeze release-management prepare-python-client command.

Basic "smoke" tests

You can run basic smoke tests to check that the client works properly; we provide a simple test script that exercises the API. To do that, you need to:

  • install the apache-airflow-client package as described above
  • install the rich Python package
  • download the test_python_client.py file
  • make sure you have a test Airflow installation running. Do not experiment with your production deployment
  • configure your Airflow webserver to enable basic authentication. In the [api] section of your airflow.cfg, set:
[api]
auth_backend = airflow.providers.fab.auth_manager.api.auth.backend.session,airflow.providers.fab.auth_manager.api.auth.backend.basic_auth

You can also set it via an environment variable: export AIRFLOW__API__AUTH_BACKENDS=airflow.providers.fab.auth_manager.api.auth.backend.session,airflow.providers.fab.auth_manager.api.auth.backend.basic_auth

  • configure your Airflow webserver to load the example DAGs. In the [core] section of your airflow.cfg, set:
[core]
load_examples = True

You can also set it via an environment variable: export AIRFLOW__CORE__LOAD_EXAMPLES=True

  • optionally expose the configuration (note: this is a dangerous setting). The script will happily run with the default setting, but if you want to see the configuration, you need to expose it. In the [api] section of your airflow.cfg, set:
[api]
expose_config = True

You can also set it via an environment variable: export AIRFLOW__API__EXPOSE_CONFIG=True

  • Configure your host/IP/user/password in the test_python_client.py file:
import airflow_client.client

# get the access token from the Airflow API server via /auth/token
configuration = airflow_client.client.Configuration(host="http://localhost:8080", access_token=access_token)
  • Run the scheduler (or the standalone dag file processor, if that is what you have set up) for a few parsing loops (you can pass the --num-runs parameter to it, or keep it running in the background). The script relies on the example DAGs being serialized to the DB, and this only happens when the scheduler runs with core/load_examples set to True.

  • Run the webserver, reachable at the host/port the test script targets. Make sure it has had enough time to initialize.

Run python test_python_client.py and you should see colored output showing connection attempts and their status.

Notes for Large OpenAPI documents

If the OpenAPI document is large, imports in client.apis and client.models may fail with a RecursionError indicating the maximum recursion limit has been exceeded. In that case, there are a couple of solutions:

Solution 1: Use specific imports for apis and models like:

  • from airflow_client.client.api.default_api import DefaultApi
  • from airflow_client.client.model.pet import Pet

Solution 2: Before importing the package, adjust the maximum recursion limit as shown below:

import sys

sys.setrecursionlimit(1500)
import airflow_client.client
from airflow_client.client.api import *
from airflow_client.client.models import *

Authors

dev@airflow.apache.org
