Create a DAG to schedule your first Python script

  1. Create a sample Python script
    cd /home/zetaris/airflow/dags
    chmod 775 <script_name>.py

    The content of the file is pasted below:

    def my_python_function():
        print('hello')

    my_python_function()
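Before scheduling it, the script can be smoke-tested outside Airflow. Below is a minimal sketch, assuming the step-1 script prints 'hello' when run (as step 5 expects); the temporary file stands in for your actual script:

```python
import subprocess
import sys
import tempfile
import textwrap

# Hypothetical stand-in for the step-1 script; adjust to your actual file.
script = textwrap.dedent("""\
    def my_python_function():
        print('hello')

    my_python_function()
""")

with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write(script)
    path = f.name

# This mirrors the command the BashOperator will run in step 2.
result = subprocess.run([sys.executable, path], capture_output=True, text=True)
print(result.stdout.strip())  # → hello
```

If this prints 'hello' locally, the same command should succeed when Airflow runs it.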

  2. Create a DAG Python file to tell Airflow what we are trying to schedule

    cd /home/zetaris/airflow/dags
    chmod 775 <dag_file>.py

    The content of the file is pasted below:

    #step1 - Importing Modules
    from airflow import DAG
    from datetime import datetime, timedelta
    from airflow.operators.bash import BashOperator
    from airflow.operators.python import PythonOperator

    #step2 - Define Default Arguments
    default_args = {
        'owner': 'airflow',
        'start_date': datetime(2023, 1, 1),  # adjust to your own start date
        'retries': 1,
        'retry_delay': timedelta(minutes=5),
    }

    #step3 - Instantiate the DAG
    dag = DAG(
        dag_id='DAG_1',
        default_args=default_args,
        schedule_interval='@daily',
        catchup=False,
    )

    #step4 - Define Tasks
    #start=DummyOperator(task_id='start',dag = dag)
    t1 = BashOperator(
        task_id='testairflow',
        # append your script's filename to this path
        bash_command='python3 /home/zetaris/airflow/dags/',
        dag=dag,
    )

    #step5 - Define Dependencies
    # A single task needs no dependencies; with more tasks, chain them, e.g.:
    #start >> end
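The default arguments in step 2 revolve around start_date and schedule_interval. As a simplified, stdlib-only sketch (real Airflow scheduling also involves catchup and data intervals), run times advance from the start date by the interval:

```python
from datetime import datetime, timedelta

def next_runs(start, interval, n):
    """Yield the first n scheduled times, starting at `start`."""
    t = start
    for _ in range(n):
        yield t
        t += interval

# A daily interval starting 2023-01-01 yields consecutive days.
runs = list(next_runs(datetime(2023, 1, 1), timedelta(days=1), 3))
print(runs[-1])  # → 2023-01-03 00:00:00
```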
  3. If you wish to keep the webserver and scheduler running overnight, start them with the commands below so they survive the terminal session.

    nohup airflow webserver &
    <Press Enter Key>
    nohup airflow scheduler &
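nohup combined with & keeps each process alive after you log out. A rough Python analogue of that detachment, using start_new_session (the printed command here is a placeholder, not the real scheduler):

```python
import subprocess
import sys

# Launch a child detached from the controlling session, akin to `nohup cmd &`.
proc = subprocess.Popen(
    [sys.executable, "-c", "print('placeholder for airflow scheduler')"],
    stdout=subprocess.PIPE,
    start_new_session=True,  # detach so a terminal hangup does not kill it
)
out, _ = proc.communicate()
print(out.decode().strip())
```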
  4. Refresh the Airflow UI; a DAG named 'DAG_1' should appear at the top of the list. Click the toggle button on the left to unpause the DAG.
                                                       Image: DAGs on Airflow GUI
  5. Trigger the DAG and check the output
    Click the execute/play button on the right side of 'DAG_1'
    Click on DAG_1
    Click on Graph and then on Task id, in our example 'testairflow'
    Click on Log

    If the script executes successfully, we should see 'hello' as the output in the logs.
                                                       Image: Logs of DAG Execution on Airflow UI