Apache Airflow

Apache Airflow Fundamentals Preparation Course is an on-demand class which dives into the topics covered in the Astronomer Certification: Apache Airflow Fundamentals exam. The purpose of the course is to prepare you as well as possible for the exam. Apache Airflow is a platform created by the community to programmatically author, schedule and monitor workflows. It is scalable, dynamic, extensible and modular. Without a doubt, mastering Airflow is becoming a must-have and attractive skill for anyone working with data.



 
Course Objectives:
 

In this course, you will learn:

 
  • How to code production-grade data pipelines by mastering Airflow through hands-on examples
  • How to follow best practices with Apache Airflow
  • How to scale Airflow with the Local, Celery and Kubernetes Executors
  • How to set up monitoring with Elasticsearch and Grafana
  • How to secure Airflow with authentication, crypto and the RBAC UI
  • Core and advanced concepts with their pros and limitations
  • How to master DAGs with timezones, unit testing, backfill and catchup
  • How to organise the DAG folder and keep things clean
 

Course content

 

Introduction 
  • Installing Docker
  • Installing the Astro CLI
  • Running Airflow 2.0 with the Astro CLI
The Essentials
  • Why Airflow?
  • What is Airflow?
  • Core Components
  • 2 Common Architectures
  • Core Concepts
  • Task Lifecycle
  • Installing Apache Airflow
  • Extras and Providers
  • Upgrading Apache Airflow
Interacting with Apache Airflow
  • The 3 ways
  • DAGs View
  • Tree View
  • Graph View
  • Gantt View
  • Interacting with Tasks
  • The Commands to Know
  • The REST API
The Forex Data Pipeline
  • Introduction
  • Docker reminder
  • Docker performance
  • Project: The Forex Data Pipeline
  • A bit more about the architecture
  • What is a DAG?
  • Lab: Define your DAG
  • What is an Operator?
DAGs and Tasks
  • DAG Skeleton
  • Demystifying DAG Scheduling
  • Playing with the start_date
  • Playing with the schedule_interval
  • Backfilling and Catchup
  • Focus on Operators
  • Executing Python Functions
  • Putting your DAG on hold
  • Executing Bash Commands
  • Define the path!
  • Exchanging Data
  • …We got a failure
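The scheduling topics above (start_date, schedule_interval, backfilling and catchup) can be illustrated without installing Airflow at all. The sketch below is a hypothetical helper, not Airflow's actual scheduler code: it mimics the key rule that a DAG run for an interval is only triggered once that interval has ended, and that catchup=False keeps only the most recent interval instead of backfilling every missed one.

```python
from datetime import datetime, timedelta

def simulate_catchup(start_date, schedule_interval, now, catchup=True):
    """Toy model of which execution dates the scheduler would create.

    A run for a given interval fires once that interval has *ended*,
    so the latest execution_date always lags `now` by one interval.
    """
    runs = []
    execution_date = start_date
    while execution_date + schedule_interval <= now:
        runs.append(execution_date)
        execution_date += schedule_interval
    if not catchup and runs:
        # With catchup=False, only the most recent interval is scheduled.
        runs = runs[-1:]
    return runs

# Example: a daily DAG whose start_date is 3.5 days in the past.
start = datetime(2021, 1, 1)
now = datetime(2021, 1, 4, 12, 0)
print(simulate_catchup(start, timedelta(days=1), now))
# → runs for Jan 1, Jan 2 and Jan 3 (the Jan 4 interval has not ended yet)
print(simulate_catchup(start, timedelta(days=1), now, catchup=False))
# → only the Jan 3 run
```

This is also why a brand-new daily DAG appears to "run a day late": the run stamped with yesterday's execution_date is the one covering yesterday's data interval.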
Distributing Apache Airflow
  • Introduction
  • Sequential Executor with SQLite
  • Local Executor with PostgreSQL
  • Lab: Executing tasks in parallel with the Local Executor
  • Lab: Ad Hoc Queries with the metadata database
  • Scale out Apache Airflow with Celery Executors and Redis
  • Lab: Set up the Airflow cluster with Celery Executors and Docker
  • Lab: Distributing your tasks with the Celery Executor
  • Lab: Adding new worker nodes with the Celery Executor
  • Lab: Sending tasks to a specific worker with Queues
  • Lab: Pools and priority_weights: Limiting parallelism – prioritizing tasks
  • Kubernetes Reminder
  • Scaling Airflow with Kubernetes Executors
  • Lab: Set up a 3-node Kubernetes Cluster with Vagrant and Rancher
  • Lab: Installing Airflow with Rancher and the Kubernetes Executor
  • Lab: Running your DAGs with the Kubernetes Executor
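The pools and priority_weights lab above rests on two ideas worth seeing in miniature: a pool caps how many tasks can hold a slot at once, and among the tasks waiting for a slot, the highest priority_weight runs first. This stdlib-only sketch (the function and task names are made up for illustration) models both with a heap:

```python
import heapq

def run_with_pool(tasks, pool_slots):
    """Toy scheduler: `tasks` is a list of (priority_weight, task_id).

    Returns "waves" of task ids: each wave is the set of tasks that
    would occupy the pool's slots at the same time, with higher
    priority_weight tasks claiming slots first.
    """
    # heapq is a min-heap, so negate the weight to pop the highest first.
    queue = [(-weight, task_id) for weight, task_id in tasks]
    heapq.heapify(queue)
    waves = []
    while queue:
        wave = [heapq.heappop(queue)[1] for _ in range(min(pool_slots, len(queue)))]
        waves.append(wave)
    return waves

tasks = [(1, "cleanup"), (10, "extract"), (5, "transform"), (8, "load")]
print(run_with_pool(tasks, pool_slots=2))
# → [['extract', 'load'], ['transform', 'cleanup']]
```

In real Airflow you would create the pool in the UI or CLI, point tasks at it with the `pool` argument, and tune their `priority_weight`; the ordering behaviour is the same as in this toy.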
Improving your DAGs with Advanced Concepts
  • Introduction
  • Minimising Repetitive Patterns With SubDAGs
  • Lab: Grouping your tasks with SubDAGs and Deadlocks
  • Making different paths in your DAGs with Branching
  • Lab: Make Your First Conditional Task Using Branching
  • Trigger rules for your tasks
  • Lab: Changing how your tasks are triggered
  • Avoid hard coding values with Variables, Macros and Templates
  • Lab: Templating your tasks
  • How to share data between your tasks with XComs
  • Lab: Sharing (big?) data with XComs
  • TriggerDagRunOperator or when your DAG controls another DAG
  • Lab: Trigger a DAG from another DAG
  • Dependencies between your DAGs with the ExternalTaskSensor
  • Lab: Make your DAGs dependent with the ExternalTaskSensor
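Two of the concepts above, branching and XComs, fit in a few lines of plain Python. The sketch below is a deliberately simplified stand-in (the dict, function names and values are invented for illustration, not Airflow APIs): a branch callable returns the task_id of the downstream path to follow, the way a BranchPythonOperator's callable does, and a shared dict plays the role of XCom pushes and pulls between tasks.

```python
# Toy in-memory "XCom" store shared between task callables.
xcom = {}

def extract():
    # Stands in for ti.xcom_push(key="rate", value=...).
    xcom["rate"] = 1.25
    return xcom["rate"]

def choose_branch():
    # A branch callable returns the id of the downstream task to run;
    # the other direct downstream tasks are marked as skipped.
    return "process_rate" if xcom["rate"] > 1.0 else "skip_processing"

def process_rate():
    # Stands in for ti.xcom_pull(key="rate") in a later task.
    return xcom["rate"] * 100

extract()
branch = choose_branch()
print(branch)                # → process_rate
if branch == "process_rate":
    print(process_rate())    # → 125.0
```

The "(big?)" in the lab title hints at the real-world caveat: XComs live in the metadata database and are meant for small values like file paths or counts, not for shipping datasets between tasks.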
Deploying Airflow on AWS EKS with Kubernetes Executors and Rancher
  • Introduction
  • Quick overview of AWS EKS
  • Lab: Set up an EC2 instance for Rancher
  • Lab: Create an IAM User with permissions
  • Lab: Create an ECR repository
  • Lab: Create an EKS cluster with Rancher
  • How to access your applications from the outside
  • Lab: Deploy Nginx Ingress with Catalogs (Helm)
  • Lab: Deploy and run Airflow with the Kubernetes Executor on EKS
  • Lab: Cleaning your AWS services
Monitoring Apache Airflow
  • Introduction
  • How the logging system works in Airflow
  • Lab: Setting up custom logging
  • Lab: Storing your logs in AWS S3
  • Elasticsearch Reminder
  • Lab: Configuring Airflow with Elasticsearch
  • Lab: Monitoring your DAGs with Elasticsearch
  • Introduction to metrics
  • Lab: Monitoring Airflow with TIG stack
  • Lab: Triggering alerts for Airflow with Grafana
  • Airflow maintenance DAGs
Security in Apache Airflow
  • Introduction
  • Lab: Encrypting sensitive data with Fernet
  • Lab: Rotating the Fernet Key
  • Lab: Hiding variables
  • Lab: Password authentication and filter by owner
  • Lab: RBAC UI
The Executor Kingdom
  • The Default Executor
  • Concurrency: the parameters you must know!
  • Start Scaling Apache Airflow
  • Scaling to Infinity!
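The concurrency "parameters you must know" are set in airflow.cfg (or the matching AIRFLOW__CORE__* environment variables). The fragment below is a sketch using the Airflow 2.2+ option names with their default values; in earlier 2.x releases, max_active_tasks_per_dag was called dag_concurrency:

```ini
[core]
# Max task instances running at once per scheduler, across all DAGs
parallelism = 32
# Max task instances running at once within a single DAG
max_active_tasks_per_dag = 16
# Max simultaneous DAG runs for a single DAG
max_active_runs_per_dag = 16
```

Together these form a hierarchy: a task must fit under its DAG's active-task cap, which in turn must fit under the instance-wide parallelism limit.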

 

To see the full course content, Download now

Course Prerequisites

 

We recommend that attendees of this course have:

 
  • Basic notions of Docker and Python
  • VirtualBox installed (only for the local Kubernetes cluster part)
  • Vagrant installed
  • The course "The Complete Hands-On Introduction to Apache Airflow" is a nice plus.

Who can attend

 
  • Data Engineers
  • DevOps
  • Software Engineers
  • Data Scientists

Number of Hours: 30hrs

Certification

Apache Airflow Fundamentals

Key features

  • One to One Training
  • Online Training
  • Fastrack & Normal Track
  • Resume Modification
  • Mock Interviews
  • Video Tutorials
  • Materials
  • Real Time Projects
  • Virtual Live Experience
  • Preparing for Certification

FAQs

DASVM Technologies offers 300+ IT training courses delivered by expert-level trainers with 10+ years of experience.


Call now: +91-99003 49889 to know the exciting offers available for you!

We work and coordinate with companies exclusively to get our students placed. We have a placement cell focusing on training and placements in Bangalore, and it helps more than 600 students per year.

Learn from experts active in their field, not out-of-touch trainers. Leading practitioners bring current best practices and case studies to sessions that fit into your work schedule. Our pool of highly skilled and experienced trainers supports you in specific tasks and provides professional guidance, with 24x7 learning support from mentors and a community of like-minded peers to resolve any conceptual doubts. Our trainers have contributed to the growth of our clients as well as of individual professionals.

All of our highly qualified trainers are industry experts with at least 10-12 years of relevant teaching experience. Each of them has gone through a rigorous selection process which includes profile screening, technical evaluation, and a training demo before they are certified to train for us. We also ensure that only those trainers with a high alumni rating continue to train for us.

No worries. DASVM Technologies ensures that no one misses a single lecture topic. We will reschedule classes at your convenience within the stipulated course duration. If required, you can even attend that topic with another batch.

DASVM Technologies provides many suitable modes of training to the students like:

  • Classroom training
  • One to One training
  • Fast track training
  • Live Instructor LED Online training
  • Customized training

Yes, access to the course material will be available for a lifetime once you have enrolled in the course.

You will receive a DASVM Technologies recognized course completion certificate, and we will help you crack the global certification with our training.

Yes, DASVM Technologies provides corporate training with Course Customization, Learning Analytics, Cloud Labs, Certifications and Real Time Projects, with 24x7 Support.

Yes, DASVM Technologies provides group discounts for its training programs. Depending on the group size, we offer discounts as per the terms and conditions.

We accept all major payment options: cash, card (Master, Visa, Maestro, etc.), wallets, net banking and cheques.

DASVM Technologies has a no-refund policy. Fees once paid will not be refunded. If a candidate is not able to attend a training batch, he/she can reschedule for a future batch. The due date for the balance should be honoured as per the date given. If a trainer cancels or becomes unavailable, DASVM will arrange training sessions with a backup trainer.

Your access to the Support Team is for lifetime and will be available 24/7. The team will help you in resolving queries, during and after the course.

Please Contact our course advisor +91-99003 49889. Or you can share your queries through info@dasvmtechnologies.com
