Airflow issues

My Airflow DAG configuration is different from the code I deployed

All DAGs, from every project, are deployed to the same Airflow server for a given environment. If two projects define a DAG with the same DAG ID, one silently shadows the other, so the code shown in the Airflow UI may come from the other project. Give your DAGs names that are unique across projects to prevent this.

I added pools in Airflow and updated my DAG, but my task is still scheduled in the default_pool

If you clear or restart a task in the Airflow UI, the task instance details are not reset. You can inspect them by clicking on the task (as if you were going to clear it) and then clicking "Task Instance Details" instead of "Clear". The attributes shown there, including the pool, are kept from the original run and are not cleared along with the task. Be aware of this: only newly created task instances will use your pool.
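For reference, a pool is assigned per task in the DAG definition. A minimal sketch of a DAG file, assuming Airflow 2 and a pool named my_pool that you have already created in the UI (the DAG ID, task ID, and pool name are all illustrative):

```python
import pendulum

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="pool_example",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    schedule=None,
) as dag:
    # The pool argument routes new instances of this task to "my_pool".
    # Task instances that already exist keep the pool they were created with.
    pooled_task = BashOperator(
        task_id="pooled_task",
        bash_command="echo hello",
        pool="my_pool",
    )
```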

I can't import a file in another file in my dags folder

Suppose you have a project called project1 with the following structure in its dags folder:

/dags
  /util.py
  /dag.py

You want to import util.py from dag.py. To do this, you need to know that on the Airflow server projects are structured like this:

/dags
  /project1
  /project2

Here project1 is the name of your project and project2 is another project deployed to the same environment. This layout avoids name clashes between different projects with overlapping file names.

To use the util.py module, you need to import it with your project name as the package prefix:

from project1.util import *
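A self-contained sketch of why this works: Airflow puts the dags folder on sys.path when parsing DAG files, so each project directory becomes an importable package. The snippet below simulates that layout in a temporary directory; the project name and helper function are illustrative, and the __init__.py is added only to make the package explicit:

```python
import os
import sys
import tempfile

# Simulate the deployed layout: <dags>/project1/util.py
dags = tempfile.mkdtemp()
os.makedirs(os.path.join(dags, "project1"))
with open(os.path.join(dags, "project1", "__init__.py"), "w") as f:
    f.write("")
with open(os.path.join(dags, "project1", "util.py"), "w") as f:
    f.write("def greet():\n    return 'hello from util'\n")

# Airflow adds the dags folder to sys.path; we do the same here,
# which is what makes the package-prefixed import resolve.
sys.path.insert(0, dags)

from project1.util import greet
print(greet())  # hello from util
```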

I can't import my module when the project name contains a hyphen

Let's say you have a project called test-project. The following statement is a syntax error, because Python identifiers, and therefore the package names used in import statements, cannot contain hyphens:

from test-project import utils

You can work around this by using importlib from the standard library, which accepts the module path as a string:

import importlib

utils = importlib.import_module("test-project.utils")
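As a self-contained illustration, here is the same pattern using a standard-library module in place of the hypothetical test-project package; import_module takes any dotted module path as a string, including ones that could not appear in an import statement:

```python
import importlib

# Stand-in for importlib.import_module("test-project.utils"):
# the returned object is the module, so attributes work as usual.
utils = importlib.import_module("email.utils")
print(utils.formataddr(("Alice", "alice@example.com")))  # Alice <alice@example.com>
```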