- Create a sample Python/shell script
cd /home/zetaris/airflow/dags
touch testpy.py
chmod 775 testpy.py
The content inside the testpy.py file is pasted below:
def my_python_function():
    print('hello')

my_python_function()
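Before wiring the script into a DAG, it can be worth confirming that it runs on its own. A quick sanity check, using the path created above:
# run the script directly; it should print 'hello'
python3 /home/zetaris/airflow/dags/testpy.py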
- Create a DAG Python file to inform Airflow that we want to schedule testpy.py
cd /home/zetaris/airflow/dags
touch dag_1.py
chmod 775 dag_1.py
The content inside the dag_1.py file is pasted below:
#step1 - Importing Modules
from airflow import DAG
from datetime import datetime, timedelta
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator
#step2 - Define Default Arguments
default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': datetime(2022, 2, 11),
    'retries': 0
}
#step3 - Instantiate the DAG
dag = DAG(dag_id='DAG-1', default_args=default_args, catchup=False, schedule_interval='@once')
#step4 - Define Tasks
#start=DummyOperator(task_id='start',dag = dag)
#end=DummyOperator(task_id='end',dag=dag)
t1 = BashOperator(
    task_id='testairflow',
    bash_command='python3 /home/zetaris/airflow/dags/testpy.py',
    dag=dag)
#step5 - Define Dependencies
#start >> end
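Before relying on the scheduler, it can help to check that the DAG file parses and that the task runs. The commands below assume the Airflow 2.x CLI (consistent with the airflow.operators.bash import above) and the dag_id/task_id defined in dag_1.py:
# check dag_1.py for syntax or import errors
python3 /home/zetaris/airflow/dags/dag_1.py
# confirm Airflow has picked up the DAG (it may take a minute to appear)
airflow dags list | grep DAG-1
# run the task once outside the scheduler; the date is an arbitrary logical date
airflow tasks test DAG-1 testairflow 2022-02-11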
- If you wish to run the webserver and scheduler overnight, use the commands below to start them.
nohup airflow webserver &
<Press Enter Key>
nohup airflow scheduler &
- Refresh the Airflow UI; a DAG named 'DAG-1' should appear at the top of the list. Click the toggle button on its left to unpause the DAG.
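If the UI does not respond or the DAG never appears, first confirm that both background processes are still running. A quick check, assuming nohup's default behaviour of appending output to nohup.out in the directory where the commands were started:
# list the background Airflow processes
ps -ef | grep airflow | grep -v grep
# follow their combined output
tail -f nohup.out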
Image: DAGs on Airflow GUI
- Trigger the DAG and check the output
Click on the execute/play button on the right side of 'DAG-1'
Click on 'DAG-1'
Click on Graph and then on the task id, in our example 'testairflow'
Click on Log
If the script executes successfully, we should be able to see 'hello' as the output in the logs:
Image: Logs of DAG Execution on Airflow UI
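The same run can also be triggered from the command line instead of the UI. The command below assumes the Airflow 2.x CLI; the on-disk log layout varies between Airflow versions:
# trigger a new run of the DAG (equivalent to the play button in the UI)
airflow dags trigger DAG-1
# task logs are also written under $AIRFLOW_HOME/logs; the exact layout depends on the Airflow version
ls ~/airflow/logs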