Note
Apache Airflow Job is powered by Apache Airflow.
Fabric Data Factory offers a powerful set of APIs that make it easy to automate and manage your Apache Airflow Jobs. You can connect to different data sources and services, and build, update, or monitor your workflows with just a few lines of code. The APIs cover everything from creating and editing Apache Airflow Jobs to tracking them — so you can keep your data flowing smoothly without the hassle.
API use cases for Apache Airflow Jobs
The APIs for Apache Airflow Jobs in Fabric Data Factory can be used in various scenarios:
- Automated deployment: Automate the deployment of Apache Airflow Jobs across different environments (development, testing, production) using CI/CD practices.
- Monitoring and alerts: Set up automated monitoring and alerting systems to track the status of Apache Airflow Jobs and receive notifications if failures or performance issues occur.
- Error handling: Implement custom error handling and retry mechanisms to ensure Apache Airflow Jobs run smoothly and recover from failures.
Understanding APIs
To effectively use the APIs for Apache Airflow Jobs in Fabric Data Factory, it's essential to understand the key concepts and components:
- Endpoints: The API endpoints provide access to various Apache Airflow Job operations, such as creating, updating, and deleting Apache Airflow Jobs.
- Authentication: Secure access to the APIs using authentication mechanisms like OAuth or API keys.
- Requests and responses: Understand the structure of API requests and responses, including the required parameters and expected output.
- Rate limits: Be aware of the rate limits imposed on API usage to avoid exceeding the allowed number of requests.
CRUD support
CRUD stands for Create, Read, Update, and Delete, which are the four basic operations that can be performed on data. In Fabric Data Factory, the CRUD operations are supported through the Fabric API for Data Factory. These APIs allow users to manage their Apache Airflow Jobs programmatically. Here are some key points about CRUD support:
- Create: Create new Apache Airflow Jobs using the API.
- Read: Retrieve information about existing Apache Airflow Jobs.
- Update: Update existing Apache Airflow Jobs.
- Delete: Delete Apache Airflow Jobs that are no longer needed.
The primary online reference documentation for Microsoft Fabric REST APIs can be found in the Microsoft Fabric REST API documentation.
Additional APIs offered in Apache Airflow Jobs
In addition to the CRUD APIs, the following operational APIs are offered for Apache Airflow Jobs:
- Job File Management APIs
Get started with REST APIs for Apache Airflow Jobs
The following documentation outlines how to create, update, and manage Apache Airflow Jobs, and how to handle operational use cases, using the Fabric Data Factory APIs.
Obtain an authorization token
Before you use the other REST APIs, you need to obtain a bearer token.
Important
In the following examples, make sure the word 'Bearer ' (with a trailing space) precedes the access token itself. If you use an API client and select 'Bearer Token' as the authentication type, 'Bearer ' is inserted for you automatically, and you only need to provide the access token.
Option 1: Using MSAL.Net
Refer to the Get Token section of the Fabric API quickstart as an example of how to obtain the MSAL authorization token.
Use MSAL.Net to acquire a Microsoft Entra ID token for the Fabric service with the following scopes: Workspace.ReadWrite.All, Item.ReadWrite.All. For more information about token acquisition with MSAL.Net, see Token Acquisition - Microsoft Authentication Library for .NET.
Copy the token from the AccessToken property and replace the <access-token> placeholder in the following examples with the token.
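If you work in Python rather than .NET, the msal Python package supports an equivalent interactive flow. The following is a minimal sketch, assuming you replace the client ID placeholder with the client ID of your own Microsoft Entra app registration:

import msal

# Minimal sketch: Python analogue of the MSAL.Net flow described above.
# "<your-client-id>" is a placeholder for your own app registration.
app = msal.PublicClientApplication(
    client_id="<your-client-id>",
    authority="https://login.microsoftonline.com/organizations",
)

# Request the two delegated scopes these examples need.
result = app.acquire_token_interactive(
    scopes=[
        "https://api.fabric.microsoft.com/Workspace.ReadWrite.All",
        "https://api.fabric.microsoft.com/Item.ReadWrite.All",
    ]
)

access_token = result["access_token"]  # use as "Bearer <access-token>"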
Option 2: Using the Fabric portal
Sign in to the Fabric portal for the tenant you want to test on, and press F12 to open the browser's developer tools. In the console there, run:
powerBIAccessToken
Copy the token and replace the <access-token> placeholder in the following examples with the token.
Create an Apache Airflow Job
Create an Apache Airflow Job in a specified workspace.
Sample request:
URI: POST https://api.fabric.microsoft.com/v1/workspaces/{workspaceId}/items
Headers:
{
"Authorization": "Bearer <access-token>",
"Content-Type": "application/json"
}
Payload:
{
"displayName": "My Apache Airflow Job",
"description": "My Apache Airflow Job description",
"type": "ApacheAirflowJobs"
}
Sample response:
{
"id": "<artifactId>",
"type": "ApacheAirflowJobs",
"displayName": "My Apache Airflow Job",
"description": "My Apache Airflow Job description",
"workspaceId": "<workspaceId>"
}
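For reference, here's how the same request might look issued from Python with the requests library; a minimal sketch, assuming access_token and workspace_id are already set:

import requests

# Minimal sketch: the "Create an Apache Airflow Job" request above,
# issued with the Python requests library.
url = f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/items"
headers = {
    "Authorization": f"Bearer {access_token}",
    "Content-Type": "application/json",
}
payload = {
    "displayName": "My Apache Airflow Job",
    "description": "My Apache Airflow Job description",
    "type": "ApacheAirflowJobs",
}

response = requests.post(url, headers=headers, json=payload)
response.raise_for_status()
print(response.json()["id"])  # the new item's ID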
Create an Apache Airflow Job with definition
Create an Apache Airflow Job with a public definition in a specified workspace.
Sample request:
URI: POST https://api.fabric.microsoft.com/v1/workspaces/{workspaceId}/items
Headers:
{
"Authorization": "Bearer <access-token>",
"Content-Type": "application/json"
}
Payload:
{
"displayName": " My Apache Airflow Job",
"description": "My Apache Airflow Job description",
"type": "ApacheAirflowJobs",
"definition": {
"parts": [
{
"path": "ApacheAirflowJob.json",
"payload": "{apacheAirflowJobPayload}",
"payloadType": "InlineBase64"
},
{
"path": ".platform",
"payload": "{apacheAirflowJobPayload}",
"payloadType": "InlineBase64"
}
]
}
}
Sample response:
{
"id": "<Your artifactId>",
"type": "ApacheAirflowJobs",
"displayName": "My Apache Airflow Job",
"description": "My Apache Airflow Job description",
"workspaceId": "<Your workspaceId>"
}
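Because payloadType is InlineBase64, each part's payload string must be Base64-encoded before it's sent. The following is a minimal sketch of building and posting the definition, assuming access_token and workspace_id are set as before; the part contents here are illustrative placeholders, not a complete Airflow job definition:

import base64
import json
import requests

def to_inline_base64(text: str) -> str:
    # Encode a part's content as the InlineBase64 payload string.
    return base64.b64encode(text.encode("utf-8")).decode("ascii")

# Illustrative placeholder contents for the two definition parts.
job_part = to_inline_base64(json.dumps({"properties": {}}))
platform_part = to_inline_base64(json.dumps({"metadata": {}}))

payload = {
    "displayName": "My Apache Airflow Job",
    "description": "My Apache Airflow Job description",
    "type": "ApacheAirflowJobs",
    "definition": {
        "parts": [
            {"path": "ApacheAirflowJob.json", "payload": job_part, "payloadType": "InlineBase64"},
            {"path": ".platform", "payload": platform_part, "payloadType": "InlineBase64"},
        ]
    },
}

url = f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/items"
response = requests.post(
    url,
    headers={"Authorization": f"Bearer {access_token}", "Content-Type": "application/json"},
    json=payload,
)
response.raise_for_status()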
Get Apache Airflow Job
Returns the properties of the specified Apache Airflow Job.
Sample request:
URI: GET https://api.fabric.microsoft.com/v1/workspaces/{workspaceId}/items/{itemId}
Headers:
{
"Authorization": "Bearer <access-token>"
}
Sample response:
{
"id": "<Your artifactId>",
"type": "ApacheAirflowJobs",
"displayName": "My Apache Airflow Job",
"description": "My Apache Airflow Job description",
"workspaceId": "<Your workspaceId>"
}
Get Apache Airflow Job with definition
Returns the Apache Airflow Job item definition.
Sample request:
URI: POST https://api.fabric.microsoft.com/v1/workspaces/{workspaceId}/items/{itemId}/getDefinition
Headers:
{
"Authorization": "Bearer <access-token>"
}
Sample response:
{
"definition": {
"parts": [
{
"path": "ApacheAirflowJob.json",
"payload": "{apacheAirflowJobPayload}",
"payloadType": "InlineBase64"
},
{
"path": ".platform",
"payload": "{apacheAirflowJobPayload}",
"payloadType": "InlineBase64"
}
]
}
}
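To work with the returned definition, decode each part's Base64 payload. A minimal sketch, assuming access_token, workspace_id, and item_id are already set, and assuming the simple synchronous 200 response shown above:

import base64
import requests

url = (
    f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}"
    f"/items/{item_id}/getDefinition"
)
response = requests.post(url, headers={"Authorization": f"Bearer {access_token}"})
response.raise_for_status()

# Decode each InlineBase64 part back to readable text.
for part in response.json()["definition"]["parts"]:
    content = base64.b64decode(part["payload"]).decode("utf-8")
    print(part["path"], "->", content[:80])  # preview the first 80 characters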
Update Apache Airflow Job
Updates the properties of the Apache Airflow Job.
Sample request:
URI: PATCH https://api.fabric.microsoft.com/v1/workspaces/{workspaceId}/items/{itemId}
Headers:
{
"Authorization": "Bearer <access-token>",
"Content-Type": "application/json"
}
Payload:
{
"displayName": "My Apache Airflow Job updated",
"description": "My Apache Airflow Job description updated",
"type": "ApacheAirflowJobs"
}
Sample response:
{
"id": "<Your artifactId>",
"type": "ApacheAirflowJobs",
"displayName": "My Apache Airflow Job updated",
"description": "My Apache Airflow Job description updated",
"workspaceId": "<Your workspaceId>"
}
Update Apache Airflow Job with definition
Updates the Apache Airflow Job item definition.
Sample request:
URI: POST https://api.fabric.microsoft.com/v1/workspaces/{workspaceId}/items/{itemId}/updateDefinition
Headers:
{
"Authorization": "Bearer <access-token>",
"Content-Type": "application/json"
}
Payload:
{
"displayName": "My Apache Airflow Job",
"type": "ApacheAirflowJobs",
"definition": {
"parts": [
{
"path": "ApacheAirflowJob.json",
"payload": "{apacheAirflowJobPayload}",
"payloadType": "InlineBase64"
},
{
"path": ".platform",
"payload": "{apacheAirflowJobPayload}",
"payloadType": "InlineBase64"
}
]
}
}
Sample response:
200 OK
Delete Apache Airflow Job
Deletes the specified Apache Airflow Job.
Sample request:
URI: DELETE https://api.fabric.microsoft.com/v1/workspaces/{workspaceId}/items/{itemId}
Headers:
{
"Authorization": "Bearer <access-token>"
}
Sample response:
200 OK
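The get, update, and delete calls all follow the same pattern; a compact sketch of the three, assuming access_token, workspace_id, and item_id are already set:

import requests

base = f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/items/{item_id}"
headers = {"Authorization": f"Bearer {access_token}"}

# Get: read the item's properties.
item = requests.get(base, headers=headers)
item.raise_for_status()

# Update: PATCH the display name and description.
updated = requests.patch(
    base,
    headers={**headers, "Content-Type": "application/json"},
    json={
        "displayName": "My Apache Airflow Job updated",
        "description": "My Apache Airflow Job description updated",
        "type": "ApacheAirflowJobs",
    },
)
updated.raise_for_status()

# Delete: remove the item when it's no longer needed.
requests.delete(base, headers=headers).raise_for_status()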
Job File Management APIs
Get Apache Airflow Job File
Returns a job file from the specified Apache Airflow Job by path.
Request URI: GET https://api.fabric.microsoft.com/v1/workspaces/{workspaceId}/apacheairflowjobs/{apacheAirflowJobId}/files/{filePath}
Sample Results:
200 OK
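A minimal sketch of downloading a file; dags/my_dag.py is an illustrative path, and access_token, workspace_id, and apache_airflow_job_id are assumed to be set:

import requests

file_path = "dags/my_dag.py"  # illustrative path
url = (
    f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}"
    f"/apacheairflowjobs/{apache_airflow_job_id}/files/{file_path}"
)
response = requests.get(url, headers={"Authorization": f"Bearer {access_token}"})
response.raise_for_status()

# The file content is returned as binary; save it locally.
with open("my_dag.py", "wb") as f:
    f.write(response.content)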
Create/Update Apache Airflow Job File
Creates or updates an Apache Airflow Job file.
Request URI: PUT https://api.fabric.microsoft.com/v1/workspaces/{workspaceId}/apacheairflowjobs/{apacheAirflowJobId}/files/{filePath}
Request Payload:
Binary
Python files (DAGs) should be UTF-8 encoded.
Sample Results:
200 OK
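A minimal sketch of uploading a DAG file as a UTF-8 encoded binary body; the DAG source below is an illustrative stub, and the same variables as above are assumed:

import requests

# Illustrative DAG stub; any valid Python DAG file works here.
dag_source = '''\
from airflow import DAG
from airflow.operators.empty import EmptyOperator
import pendulum

with DAG(dag_id="hello_fabric",
         start_date=pendulum.datetime(2024, 1, 1),
         schedule=None) as dag:
    EmptyOperator(task_id="start")
'''

url = (
    f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}"
    f"/apacheairflowjobs/{apache_airflow_job_id}/files/dags/hello_fabric.py"
)
response = requests.put(
    url,
    headers={"Authorization": f"Bearer {access_token}"},
    data=dag_source.encode("utf-8"),  # Python files must be UTF-8 encoded
)
response.raise_for_status()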
Delete Apache Airflow Job File
Deletes the specified Apache Airflow Job file.
Request URI: DELETE https://api.fabric.microsoft.com/v1/workspaces/{workspaceId}/apacheairflowjobs/{apacheAirflowJobId}/files/{filePath}
Sample Results:
200 OK
List Apache Airflow Job Files
Lists the files in the specified Apache Airflow Job.
Request URI: GET https://api.fabric.microsoft.com/v1/workspaces/{workspaceId}/apacheairflowjobs/{apacheAirflowJobId}/files?rootPath="my_folder"&continuationToken={token}
Note that rootPath and continuationToken are optional.
Sample Results:
{
"files": [
{ "filePath": "<filePath>", "sizeInBytes": <sizeInBytes> }
],
"continuationToken": "LDEsMTAwMDAwLDA%3D",
"continuationUri": "https://api.fabric.microsoft.com/v1/workspaces/{workspaceId}/apacheairflowjobs/{apacheAirflowJobId}/files?continuationToken='LDEsMTAwMDAwLDA%3D'"
}
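A minimal sketch of paging through every file with the continuation token, assuming the same variables as above:

import requests

url = (
    f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}"
    f"/apacheairflowjobs/{apache_airflow_job_id}/files"
)
headers = {"Authorization": f"Bearer {access_token}"}
params = {}  # optionally set "rootPath": "my_folder" to scope the listing

# Keep requesting pages until the service stops returning a token.
while True:
    response = requests.get(url, headers=headers, params=params)
    response.raise_for_status()
    body = response.json()
    for f in body.get("files", []):
        print(f["filePath"], f["sizeInBytes"])
    token = body.get("continuationToken")
    if not token:
        break
    params = {"continuationToken": token}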
Service Principal Name (SPN) Support
Service Principal Name (SPN) is a security identity feature used by applications or services to access specific resources. In Fabric Data Factory, SPN support is crucial for enabling secure and automated access to data sources. Here are some key points about SPN support:
- Authentication: SPNs are used to authenticate applications or services when accessing data sources. This ensures that only authorized entities can access the data.
- Configuration: To use SPNs, you need to create a service principal in Azure and grant it the necessary permissions to access the data source. For example, if you're using a data lake, the service principal needs storage blob data reader access.
- Connection: When setting up a data connection in Fabric Data Factory, you can choose to authenticate using a service principal. This involves providing the tenant ID, client ID, and client secret of the service principal.
- Security: Using SPNs enhances security by avoiding the use of hardcoded credentials in your workflows. It also allows for better management of access permissions and auditing of access activities.
For more detailed information on how to set up and use SPNs in Fabric Data Factory, refer to SPN support in Data Factory.
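As an illustration, acquiring a Fabric token as a service principal uses the client-credentials flow; a minimal sketch with the Python msal package, where the tenant ID, client ID, and client secret placeholders come from your own app registration:

import msal

# Minimal sketch: client-credentials flow for a service principal.
# The three placeholders come from your own Microsoft Entra app registration.
app = msal.ConfidentialClientApplication(
    client_id="<client-id>",
    client_credential="<client-secret>",
    authority="https://login.microsoftonline.com/<tenant-id>",
)

# Service principals request the .default scope rather than delegated scopes.
result = app.acquire_token_for_client(
    scopes=["https://api.fabric.microsoft.com/.default"]
)
access_token = result["access_token"]  # use as "Bearer <access-token>"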
Related content
Refer to the following content for more information on APIs in Apache Airflow Jobs in Fabric Data Factory: