www.clairvoyantsoft.com
Airflow
A CLAIRVOYANT Story
Quick Poll
| 2
| 3
Robert Sanders
Big Data Manager and Engineer
Shekhar Vemuri
CTO
Shekhar works with clients across various industries to
help define data strategy and lead the implementation of
data engineering and data science efforts.
He was Co-founder and CTO of Blue Canary, a predictive
analytics solution to help with student retention; Blue
Canary was acquired by Blackboard in 2015.
One of the early employees of Clairvoyant, Robert
primarily works with clients to enable them along their
big data journey. Robert has a deep background in web
and enterprise systems, working on full-stack
implementations and then focusing on data
management platforms.
| 4
About
Background Awards & Recognition
Boutique consulting firm centered on building data solutions and
products
All things Web and Data Engineering, Analytics, ML and User
Experience to bring it all together
Support core Hadoop platform, data engineering pipelines and
provide administrative and devops expertise focused on Hadoop
| 5
Currently working on building a data security solution to help enterprises
discover, secure and monitor sensitive data in their environment.
| 6
● What is Apache Airflow?
○ Features
○ Architecture
● Use Cases
● Lessons Learned
● Best Practices
● Scaling & High Availability
● Deployment, Management & More
● Questions
Agenda
| 7
Hey Robert, I heard about this new
hotness that will solve all of our
workflow scheduling and
orchestration problems. I played
with it for 2 hours and I am in love!
Can you try it out?
Must be pretty cool. I
wonder how it compares
to what we’re using. I’ll
check it out!
Genesis
| 8
Building Expertise vs. Yak Shaving
| 9
| 10
● Mostly used Cron and Oozie
● Did some crazy things with Java and Quartz in a past life
● A lot of operational support effort was going into debugging Oozie workloads and the issues we
ran into with it
○ 4+ years of working with Oozie: “built expertise??”
● Needed a scalable, open source, user-friendly engine for
○ Internal product needs
○ Client engagements
○ Making our Ops and Support teams’ lives easier
Why?
| 11
Scheduler Landscape
| 12
● “Apache Airflow is an Open Source platform to programmatically Author, Schedule and Monitor workflows”
○ Workflows as Python Code (this is huge!!!!!)
○ Provides monitoring tools like alerts and a web interface
● Written in Python
● Apache Incubator Project
○ Joined the Apache Incubator in early 2016
○ https://github.com/apache/incubator-airflow/
What is Apache Airflow?
| 13
● Lightweight Workflow Platform
● Full blown Python scripts as DSL
● More flexible execution and workflow generation
● Feature Rich Web Interface
● Worker Processes can Scale Horizontally and Vertically
● Extensible
Why use Apache Airflow?
| 14
Building Blocks
| 15
Different Executors
● SequentialExecutor
● LocalExecutor
● CeleryExecutor
● MesosExecutor
● KubernetesExecutor (Coming Soon)
Executors
What are Executors?
Executors are the mechanism by which task
instances get run.
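The executor is chosen in the `[core]` section of airflow.cfg. A minimal sketch (the executor names are the stock options; the connection string is a placeholder, not from the talk):

```ini
[core]
# SequentialExecutor (the default) runs one task at a time against SQLite;
# LocalExecutor and CeleryExecutor need a real metastore (MySQL/PostgreSQL).
executor = LocalExecutor
sql_alchemy_conn = postgresql+psycopg2://airflow:airflow@localhost/airflow
```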
| 16
Single Node Deployment
| 17
Multi-Node Deployment
| 18
# Library Imports
from airflow.models import DAG
from airflow.operators import BashOperator
from datetime import datetime, timedelta

# Define global variables and default arguments
START_DATE = datetime.now() - timedelta(minutes=1)
default_args = {
    'owner': 'Airflow',
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
}

# Define the DAG
dag = DAG('dag_id', default_args=default_args, schedule_interval='0 0 * * *', start_date=START_DATE)

# Define the Tasks
task1 = BashOperator(task_id='task1', bash_command="echo 'Task 1'", dag=dag)
task2 = BashOperator(task_id='task2', bash_command="echo 'Task 2'", dag=dag)
task3 = BashOperator(task_id='task3', bash_command="echo 'Task 3'", dag=dag)

# Define the Task Relationships
task1.set_downstream(task2)
task2.set_downstream(task3)
Defining a Workflow
| 19
dag = DAG('dag_id', …)
last_task = None
for i in range(1, 3):
    task = BashOperator(
        task_id='task' + str(i),
        bash_command="echo 'Task" + str(i) + "'",
        dag=dag)
    if last_task is None:
        last_task = task
    else:
        last_task.set_downstream(task)
        last_task = task
Defining a Dynamic Workflow
| 20
● Action Operators
○ BashOperator(bash_command)
○ SSHOperator(ssh_hook, ssh_conn_id, remote_host, command)
○ PythonOperator(python_callable=python_function)
● Transfer Operators
○ HiveToMySqlTransfer(sql, mysql_table, hiveserver2_conn_id, mysql_conn_id, mysql_preoperator, mysql_postoperator, bulk_load)
○ MySqlToHiveTransfer(sql, hive_table, create, recreate, partition, delimiter, mysql_conn_id, hive_cli_conn_id, tblproperties)
● Sensor Operators
○ HdfsSensor(filepath, hdfs_conn_id, ignored_ext, ignore_copying, file_size, hook)
○ HttpSensor(endpoint, http_conn_id, method, request_params, headers, response_check, extra_options)
Many More
Operators
| 21
● Kogni discovers sensitive data across all data sources in the enterprise
● Needed to configure scans with various schedules, working standalone or with a Spark cluster
● Orchestrate, execute and manage dozens of pipelines that scan and ingest data in a secure
fashion
● Needed a tool to manage this outside of the core platform
● Started by exporting Oozie configuration from the core app - but conditional aspects and
visibility became an issue
● Needed something that supported deep DAGs and broad DAGs
First Use Case (Description)
| 22
● Daily ETL Batch Process to Ingest data into Hadoop
○ Extract
■ 1226 tables from 23 databases
○ Transform
■ Impala scripts to join and transform data
○ Load
■ Impala scripts to load data into common final tables
● Other requirements
○ Make it extensible to allow the client to import more databases and tables in the future
○ Status emails to be sent out after daily job to report on success and failures
● Solution
○ Create a DAG that dynamically generates the workflow based on data in a Metastore
Second Use Case (Description)
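The metastore-driven generation step can be illustrated without Airflow. This is a minimal sketch, not the talk's implementation: the `ingest_config` table, its columns, and the sample rows are hypothetical stand-ins for the real metastore, and the generated ids stand in for the BashOperator/Impala tasks the real DAG would create per table.

```python
import sqlite3

# Hypothetical metastore: in the real setup this would be the MySQL/PostgreSQL
# database listing the source databases and tables to ingest.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ingest_config (db_name TEXT, table_name TEXT)")
conn.executemany(
    "INSERT INTO ingest_config VALUES (?, ?)",
    [("sales", "orders"), ("sales", "customers"), ("hr", "employees")],
)

def build_task_ids(connection):
    """Generate one extract-task id per configured table, the same loop the
    DAG file would run with real operators instead of plain strings."""
    rows = connection.execute(
        "SELECT db_name, table_name FROM ingest_config ORDER BY db_name, table_name"
    ).fetchall()
    return ["extract_%s_%s" % (db, tbl) for db, tbl in rows]

task_ids = build_task_ids(conn)
```

Adding a table to the metastore adds a task on the next scheduler parse, which is how the "client can import more databases and tables" requirement is met without editing the DAG file.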
| 23
Second Use Case (Architecture)
| 24
Second Use Case (DAG)
1,000 ft view 100,000 ft view
| 25
● Support
● Documentation
● Bugs and Odd Behavior
● Monitor Performance with Charts
● Tune Retries
● Tune Parallelism
Lessons Learned
| 26
● Load Data Incrementally
● Process Historic Data with Backfill operations
● Enforce Idempotency (retry safe)
● Execute Conditionally (BranchPythonOperator, ShortCircuitOperator)
● Alert if there are failures (task failures and SLA misses) (Email/Slack)
● Use Sensor Operators to determine when to Start a Task (if applicable)
● Build Validation into your Workflows
● Test as much as you can - but it needs some thought
Best Practices
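"Enforce Idempotency (retry safe)" usually means keying a task's output to its execution date, so a retry or backfill overwrites the same partition instead of appending duplicates. A minimal sketch with hypothetical names (`load_partition`, the `ds=` layout) rather than any Airflow API:

```python
import json
import tempfile
from datetime import date
from pathlib import Path

def load_partition(base_dir, execution_date, records):
    """Write records to a partition derived from the execution date.
    Rerunning for the same date overwrites the same file, so retries
    and backfills cannot create duplicate data."""
    part = Path(base_dir) / ("ds=%s" % execution_date.isoformat())
    part.mkdir(parents=True, exist_ok=True)
    out = part / "data.json"
    out.write_text(json.dumps(records))  # full overwrite, never append
    return out

base = tempfile.mkdtemp()
p1 = load_partition(base, date(2017, 1, 1), [1, 2, 3])
p2 = load_partition(base, date(2017, 1, 1), [1, 2, 3])  # simulated retry
```

The retry lands on the same path with the same content, which is what makes "Process Historic Data with Backfill operations" safe to run repeatedly.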
| 27
Scaling & High Availability
| 28
High Availability for the Scheduler
Scheduler Failover Controller: https://github.com/teamclairvoyant/airflow-scheduler-failover-controller
| 29
● PIP Install airflow site packages on all Nodes
● Set AIRFLOW_HOME env variable before setup
● Utilize MySQL or PostgreSQL as a Metastore
● Update Web App Port
● Utilize SystemD or Upstart Scripts (https://github.com/apache/incubator-airflow/tree/master/scripts)
● Set Log Location
○ Local File System, S3 Bucket, Google Cloud Storage
● Daemon Monitoring (Nagios)
● Cloudera Manager CSD (Coming Soon)
Deployment & Management
| 30
● Web App Authentication
○ Password
○ LDAP
○ OAuth: Google, GitHub
● Role Based Access Control (RBAC) (Coming Soon)
● Protect airflow.cfg (expose_config, read access to airflow.cfg)
● Web App SSL
● Kerberos Ticket Renewer
Security
| 31
● PyUnit - Unit Testing
● Test DAG Tasks Individually
airflow test [--subdir SUBDIR] [--dry_run] [--task_params TASK_PARAMS_JSON] dag_id task_id execution_date
● Airflow Unit Test Mode - Loads configurations from the unittests.cfg file
[tests]
unit_test_mode = true
● Always, at the very least, ensure that the DAG is valid (can be done as part of CI)
● Take it a step further with mock pipeline testing (with inputs and outputs), especially if your DAGs
are broad
Testing
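At its core, "ensure that the DAG is valid" means the task graph must be acyclic (Airflow performs this check when it imports the DAG file). The check itself can be sketched without Airflow; `has_cycle` and its coloring scheme are a hypothetical minimal version, not Airflow's implementation:

```python
def has_cycle(deps):
    """deps maps task_id -> list of downstream task_ids.
    Returns True if the graph contains a cycle, i.e. it is not a valid DAG.
    Uses depth-first search with three node colors: unvisited (WHITE),
    on the current path (GRAY), fully explored (BLACK)."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {t: WHITE for t in deps}

    def visit(node):
        color[node] = GRAY
        for nxt in deps.get(node, []):
            if color.get(nxt, WHITE) == GRAY:
                return True  # back edge to the current path: cycle
            if color.get(nxt, WHITE) == WHITE and visit(nxt):
                return True
        color[node] = BLACK
        return False

    return any(color[t] == WHITE and visit(t) for t in list(deps))

valid = {"task1": ["task2"], "task2": ["task3"], "task3": []}
broken = {"task1": ["task2"], "task2": ["task1"]}
```

Running a check like this in CI catches a broken DAG before it ever reaches the scheduler.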
Questions?
| 32
We are hiring!
| 33
@shekharv
shekhar@clairvoyant.ai
linkedin.com/in/shekharvemuri
@rssanders3
robert@clairvoyant.ai
linkedin.com/in/robert-sanders-cs