1: Create a virtual environment (recommended)
This avoids permission issues and conflicts between library versions:
$ python3 -m venv airflow_env && source airflow_env/bin/activate
Then upgrade pip and the packaging tools:
$ pip install --upgrade pip setuptools wheel
2: Install the Airflow base package, without extras or constraints (just to let pip resolve the versions):
$ pip install "apache-airflow==3.1.0"
This installs the Airflow core with dependency versions that pip considers compatible with Python 3.10, in my case.
3: Generate the constraints file using "pip freeze" to capture the exact versions of all installed dependencies:
$ pip freeze > my_constraints.txt
This creates a my_constraints.txt file with a listing like this:
...
apache-airflow==3.1.0
click==8.1.3
jinja2==3.1.2
sqlalchemy==2.1.0
etc...
This file now serves as your constraints file: future installations will use exactly these versions.
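One caveat worth noting: on some setups, `pip freeze` can emit editable ("-e ...") or direct-URL ("pkg @ file://...") lines, which are not valid pins in a constraints file. A small filter keeps only the `name==version` entries. This is a sketch; the sample lines below are illustrative, not taken from my environment:

```shell
# Simulated `pip freeze` output (illustrative entries only):
freeze_output='apache-airflow==3.1.0
click==8.1.3
-e git+https://example.com/mypkg.git#egg=mypkg
mypkg @ file:///tmp/mypkg'

# Keep only name==version pins; drop editable and direct-URL lines:
printf '%s\n' "$freeze_output" | grep -vE '^(-e |.* @ )' > my_constraints.txt
cat my_constraints.txt
# apache-airflow==3.1.0
# click==8.1.3
```

In practice you would pipe the real output straight through the filter: `pip freeze | grep -vE '^(-e |.* @ )' > my_constraints.txt`.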
4: Install Airflow with extras using our brand-new constraints file
With the custom constraints in place, install Airflow with the extras you want (AWS, Celery, etc.):
$ pip install "apache-airflow[amazon]==3.1.0" --constraint ./my_constraints.txt
pip takes the versions from my_constraints.txt, doesn't try to download the constraints file from GitHub, and therefore can't hit the 429 (rate-limit) error.
So now I have:
- Airflow 3.1.0 working on Python 3.10
- AWS extras installed, and
- A local constraints file that I can always use to reproduce installations
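As a quick sanity check (a sketch, assuming the venv is still active), you can confirm the environment actually matches the pins by freezing again and diffing against the constraints file:

```shell
# Snapshot the current environment and compare it to the baseline.
# (The baseline is regenerated here so the sketch is self-contained;
# in practice you would diff against the my_constraints.txt from step 3.)
pip freeze | sort > my_constraints.txt
pip freeze | sort > current_env.txt
diff my_constraints.txt current_env.txt && echo "environment matches constraints"
```

An empty diff means every installed package is at exactly the pinned version.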
Ta-da!
Let me know how it goes!