
Bump apache-airflow from 2.2.4 to 3.2.0#271

Open
dependabot[bot] wants to merge 1 commit into master from dependabot/pip/apache-airflow-3.2.0

Conversation

Contributor

@dependabot dependabot Bot commented on behalf of github Apr 29, 2026

Bumps apache-airflow from 2.2.4 to 3.2.0.

Release notes

Sourced from apache-airflow's releases.

Apache Airflow 3.2.0

  • 📦 PyPI: https://pypi.org/project/apache-airflow/3.2.0/
  • 📚 Docs: https://airflow.apache.org/docs/apache-airflow/3.2.0/
  • 🛠 Release Notes: https://airflow.apache.org/docs/apache-airflow/3.2.0/release_notes.html
  • 🐳 Docker Image: docker pull apache/airflow:3.2.0
  • 🚏 Constraints: https://github.com/apache/airflow/tree/constraints-3.2.0

Significant Changes

Asset Partitioning

The headline feature of Airflow 3.2.0 is asset partitioning — a major evolution of data-aware scheduling. Instead of triggering Dags based on an entire asset, you can now schedule downstream processing based on specific partitions of data. Only the relevant slice of data triggers downstream work, making pipeline orchestration far more efficient and precise.

This matters when working with partitioned data lakes — date-partitioned S3 paths, Hive table partitions, BigQuery table partitions, or any other partitioned data store. Previously, any update to an asset triggered all downstream Dags regardless of which partition changed. Now only the right work gets triggered at the right time.

For detailed usage instructions, see :doc:`/authoring-and-scheduling/assets`.
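The partition-aware triggering idea can be pictured in plain Python. The sketch below is a conceptual illustration only — `AssetEvent`, `SUBSCRIPTIONS`, and `dags_to_trigger` are hypothetical names, not the Airflow 3.2 API; see the assets documentation for the real interface.

```python
# Conceptual sketch of partition-aware scheduling: an asset event
# carries a partition key, and only downstream consumers subscribed
# to that partition are triggered. All names here are illustrative,
# not the Airflow 3.2 API.

from dataclasses import dataclass


@dataclass(frozen=True)
class AssetEvent:
    asset: str        # e.g. an S3 prefix
    partition: str    # e.g. a date partition like "ds=2026-04-29"


# Which partitions each downstream Dag consumes.
SUBSCRIPTIONS = {
    "daily_sales_report": {"ds=2026-04-29"},
    "backfill_cleanup": {"ds=2026-04-28"},
}


def dags_to_trigger(event: AssetEvent) -> list[str]:
    """Return only the Dags whose subscribed partitions include the
    partition that was updated; an asset update no longer fans out
    to every downstream consumer."""
    return [
        dag_id
        for dag_id, partitions in SUBSCRIPTIONS.items()
        if event.partition in partitions
    ]


event = AssetEvent(asset="s3://lake/sales/", partition="ds=2026-04-29")
print(dags_to_trigger(event))  # only the consumer of that partition
```

Under the pre-3.2 behavior described above, both Dags would have been triggered by any update to the asset; with partitioning, only the subscriber of the changed slice runs.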

Multi-Team Deployments

Airflow 3.2 introduces multi-team support, allowing organizations to run multiple isolated teams within a single Airflow deployment. Each team can have its own Dags, connections, variables, pools, and executors, enabling true resource and permission isolation without requiring separate Airflow instances per team.

This is particularly valuable for platform teams that serve multiple data engineering or data science teams from shared infrastructure, while maintaining strong boundaries between teams' resources and access.

For detailed usage instructions, see :doc:`/core-concepts/multi-team`.

.. warning::

Multi-Team Deployments are experimental in 3.2.0 and may change in future versions based on user feedback.
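The isolation model can be sketched as a toy namespace lookup. This is a conceptual stand-in, not Airflow configuration or API: `TEAMS` and `get_connection` are hypothetical names used only to illustrate team-scoped resources.

```python
# Toy model of multi-team isolation: each team owns its own
# connections and pools, and lookups are scoped to the caller's team.
# Hypothetical names; not the Airflow 3.2 configuration API.

TEAMS = {
    "data-eng": {"connections": {"warehouse"}, "pools": {"etl_pool"}},
    "ml": {"connections": {"feature_store"}, "pools": {"train_pool"}},
}


def get_connection(team: str, conn_id: str) -> str:
    """Resolve a connection only within the requesting team's
    namespace; cross-team access is refused."""
    if conn_id not in TEAMS[team]["connections"]:
        raise PermissionError(f"{team!r} has no connection {conn_id!r}")
    return f"{team}/{conn_id}"


print(get_connection("data-eng", "warehouse"))
```

The point of the sketch is the boundary: a lookup by the `ml` team for `warehouse` fails even though the connection exists in the same deployment, which is the "strong boundaries between teams' resources" property described above.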

Synchronous callback support for Deadline Alerts

Deadline Alerts now support synchronous callbacks via SyncCallback in addition to the existing asynchronous AsyncCallback. Synchronous callbacks are executed by the executor (rather than the triggerer), and can optionally target a specific executor via the executor parameter.

A Dag can also define multiple Deadline Alerts by passing a list to the deadline parameter, and each alert can use either callback type.
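The shape described above — several alerts per Dag, each choosing a synchronous or asynchronous callback — can be sketched with stand-in classes. These are illustrative dataclasses, not the real airflow imports; the `fn` field and `run_callbacks` helper are hypothetical.

```python
# Stand-in sketch of the Deadline Alert shape: a Dag may carry several
# alerts, each using either a synchronous callback (run on the
# executor, optionally a specific one) or an asynchronous callback
# (run on the triggerer). Illustrative classes, not airflow imports.

from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class SyncCallback:
    fn: Callable[[], str]
    executor: Optional[str] = None  # optionally target a specific executor


@dataclass
class AsyncCallback:
    fn: Callable[[], str]           # would run on the triggerer


@dataclass
class DeadlineAlert:
    deadline_seconds: int
    callback: object                # either callback type


def run_callbacks(alerts: list[DeadlineAlert]) -> list[tuple[str, str]]:
    """Fire every alert's callback, noting where each one would run."""
    results = []
    for alert in alerts:
        where = "executor" if isinstance(alert.callback, SyncCallback) else "triggerer"
        results.append((where, alert.callback.fn()))
    return results


alerts = [
    DeadlineAlert(3600, SyncCallback(lambda: "page on-call", executor="CeleryExecutor")),
    DeadlineAlert(7200, AsyncCallback(lambda: "post to Slack")),
]
print(run_callbacks(alerts))
```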

.. warning::

Deadline Alerts are experimental in 3.2.0 and may change in future versions based on

... (truncated)

Changelog

Sourced from apache-airflow's changelog.

.. Licensed to the Apache Software Foundation (ASF) under one or more contributor license agreements. See the NOTICE file distributed with this work for additional information regarding copyright ownership. The ASF licenses this file to you under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at

.. http://www.apache.org/licenses/LICENSE-2.0

.. Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.

Dockerfile Changelog

The Dockerfile does not strictly follow the `SemVer <https://semver.org/>`_ approach of Apache Airflow when it comes to features and backwards compatibility. While Airflow code strictly follows it, the Dockerfile is really a way to give users a conveniently packaged Airflow using the standard container approach, so occasionally there are changes in the building process or in the entrypoint of the image that require slight adaptation of how it is used or built.

The Changelog below describes the changes introduced in each version of the docker images released by the Airflow team.

:note: The Changelog below concerns only the convenience production images released at `Airflow DockerHub <https://hub.docker.com/r/apache/airflow>`_. The images released there are usually built using the Dockerfile released together with Airflow. However, you are free to take the latest released Dockerfile from Airflow and use it to build an image for any Airflow version from the Airflow 2 line. There is no guarantee that it will work, but if it does, then you can use the latest features from that image to build images for previous Airflow versions.

Airflow 3.1.4


In Airflow 3.1.4, the images are built without removing .pyc and .pyo files when Python is built.
This increases the size of the image slightly (<0.5%) but improves Python's performance in the container,
because Python does not need to recompile the files on the first run. More importantly, if you use
``exec`` to run health checks, the removed .pyc files caused a small but ever-growing memory leak in the
kernel, connected to the negative ``dentries`` created when compilation of the missing .pyc files was
attempted and failed. Over time this could lead to out-of-memory issues on the host running the container.

More information about dentries can be found in `this article <https://lwn.net/Articles/814535/>`_.

Airflow 3.1.0

... (truncated)

Commits
  • 3bc3ccf Update release notes for 3.2.0rc2
  • 5311961 [v3-2-test] Guard against null trigger in asset watcher cleanup (#64659) (#64...
  • 3840fae [v3-2-test] Fix ObjectStoragePath NoCredentialsError when using conn_id with ...
  • 959ebd8 [v3-2-test] Fix double-serialization issue by unwrapping serialized kwargs in...
  • d1d2416 [v3-2-test] Run DB check only for core components in prod entrypoint (#63413)...
  • beee8b6 [v3-2-test] Remove false-positive RFC3986 underscore warning from Connection....
  • 68d9874 [v3-2-test] fix: restore early return in check_for_write_conflict (#64062) (#...
  • 0a000c4 [v3-2-test] Fix deferred task resume failure when worker is older than server...
  • 8d0fb4b [v3-2-test] Fix serde deserialization of old-format builtin types in trigger ...
  • e64b7e4 [v3-2-test] Fix: Restore live stdout logging for Elasticsearch log forwarding...
  • Additional commits viewable in compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    You can disable automated security fix PRs for this repo from the Security Alerts page.

Bumps [apache-airflow](https://github.com/apache/airflow) from 2.2.4 to 3.2.0.
- [Release notes](https://github.com/apache/airflow/releases)
- [Changelog](https://github.com/apache/airflow/blob/main/docker-stack-docs/changelog.rst)
- [Commits](apache/airflow@2.2.4...3.2.0)

---
updated-dependencies:
- dependency-name: apache-airflow
  dependency-version: 3.2.0
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>
dependabot[bot] added the dependencies (Pull requests that update a dependency file) and python (Pull requests that update Python code) labels on Apr 29, 2026