The Future of Apache Hadoop Security
Joey Echeverria, Chief Architect of Public Sector
©2014 Cloudera, Inc. All rights reserved.
Hadoop | had(y)ōōp |
noun
a system for executing arbitrary binaries over arbitrary, often large datasets: we used Hadoop to count an exabyte of words.
Joey Echeverria
joey@cloudera.com
@fwiffo


Editor's notes

  1. You probably came to hear me talk about security, but security is boring. So instead, I’m going to talk about hungry hungry hippos. If you haven’t played hungry hungry hippos before, it’s a fairly simple game where four players compete to collect the most marbles. Now hungry hungry hippos is usually played with all white marbles. This makes sense because all players can collect all marbles. But I’m going to change the rules. Image source: https://www.flickr.com/photos/carbonnyc/3234684182
  2. BOOM, multiple colors. That’s more like it. Now that I have all this great variety, I want to restrict which hippos can consume which colors of marbles. Why do I want this new rule? Well, it doesn’t matter because I’m the one giving the talk so I get to make the rules. Image source: https://www.flickr.com/photos/andrewmorrell/55268996
  3. Although seriously, do you want any of these yahoos to be able to collect any color of marble? Take the guy in green: I never trust a man with a beard. Now that we've established that we want to limit access to certain colors of marbles, let's brainstorm a couple of different ways to implement this. Image source: https://www.flickr.com/photos/timefortea3/12938376425/
  4. Let's start by sorting the marbles into groups of the same color. We can then control access to these groups of marbles by creating magical boundaries that only specific hippos can penetrate. This system works well, but it's not very granular: we have to pre-group the marbles that each hippo can access into its own magic box. Image source: https://www.flickr.com/photos/stevendepolo/5601377451
  5. If some magic is good, more magic is even better! I’d much rather have magic marbles and magic hippos. Instead of having to first put the same colored marbles into the same magic box, now I can mix all the marbles together. Thanks to magic, each hippo will only grab the colors that they’re allowed. Any other marbles will pass right through them. This saves me a lot of time and it also allows me to invite more players. Now anyone can play and they’ll only ever collect the right kind of marble. Image source: https://www.flickr.com/photos/jdhancock/7082879485
  6. This is convenient because I’m very lazy. I’d hate to have to pre-sort things. It’s much easier for me to grab a handful of marbles and just throw them on the board and let the magic sort it all out. Image source: https://www.flickr.com/photos/tambako/633374069
  7. At this point you may be asking what on earth does any of this have to do with Hadoop? Well, you might be asking that assuming you didn’t just walk out while I was up here rambling about hippos, marbles, and magic. If you think about how the usage of Hadoop has evolved, it started very much the same way as our hungry hungry hippos game. All of the marbles were the same color and any player could collect any marble. This was great when we were deploying Hadoop for a small set of users and we trusted every user with all of the data. But when something is useful, it inevitably leads to more adoption. Image source: https://www.flickr.com/photos/secretlondon/4582476286
  8. As more and more people show up to use our cluster, we have to think more and more carefully about who has access to what data. Before Hadoop had strong security controls, it implemented advisory authorization at the file and directory level. I say advisory because, while permissions existed, Hadoop initially didn't require that you prove you are who you say you are. This helped prevent mistakes, but didn't stop malicious users. Image source: https://www.flickr.com/photos/scott-s_photos/12712204375
  9. Hadoop solved this problem by adding support for Kerberos-based authentication. Now each user was given strong credentials that they could use to gain access to the system. Kerberos has become so synonymous with Hadoop security that 90% of the time, when someone says they configured or enabled Hadoop security, they're probably talking about turning on Kerberos authentication. Image source: https://www.flickr.com/photos/16048742@N08/3458184491
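The Kerberos workflow described above amounts to obtaining a ticket before issuing any Hadoop command. A minimal sketch, assuming a hypothetical user principal and realm (on a real cluster the realm, principal, and paths would come from your site's Kerberos configuration):

```shell
# Obtain a Kerberos ticket-granting ticket for a (hypothetical) user principal.
kinit joey@EXAMPLE.COM

# With a valid ticket, ordinary Hadoop commands authenticate transparently.
hadoop fs -ls /user/joey

# Without a ticket, a Kerberos-secured cluster rejects the request with a
# GSSException-style "No valid credentials provided" error.
```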
  10. This is great. You could now check that each user was who they said they were. You still have a bit of a problem in that permissions only exist at the file level. That means that if I want to control access to particular types of data, I have to merge all of the protected data into its own files and set permissions accordingly. This is especially annoying if I'm using a query language like Impala, Hive or Pig: I'm now trying to access tables of records, but I have to manage my security controls with very blunt tools. Image source: https://www.flickr.com/photos/ballena/4167217995
  11. Before I talk about how we've made progress on the granularity problem, I want to take a minute to talk about how Hadoop, and in particular MapReduce, implements process-based isolation. Before Hadoop had security, every job was executed as the hadoop user. This meant that even though you accessed files by default using the user identity that submitted the job, there was nothing that prevented you from peeking over your shoulder and looking at some of the output from another job, since all of the intermediate data was protected by OS permissions and all jobs ran as the same OS user. Hadoop solved this problem by adding the ability to su to the user that submitted the job before executing the job process. This is very useful from a security perspective, but I bring it up because it's something that often trips up new administrators deploying Hadoop for the first time. TL;DR: you must provision user accounts on every node in your cluster for any user that can run a MapReduce job. This is often done using LDAP or Active Directory so you don't have to manage all the accounts by hand. Image source: https://www.flickr.com/photos/mkamp/2429091134
  12. At it’s core, Hadoop is a system for executing arbitrary code over arbitrary data. Let me say that one more time. Arbitrary code running over arbitrary data. This is why Hadoop security is tougher than most other systems. The system starts with the ability to just run random code, you need to set up multiple barriers of protection before you have a fully secured system.
  13. Why do we care about controlling data access with finer granularity? It all comes down to multitenancy. It's cool to have a 100-node Hadoop cluster that serves all of the users in your department, but it's even cooler to have a 1,000-node Hadoop cluster that serves all of the users in your company. Because we want to share these large clusters to get good economies of scale, we need to come up with more creative ways to control access to data. Again, we could keep sorting data into files and directories and implementing all of our controls at the file level, but that gets old fast. Image source: https://www.flickr.com/photos/erozkosz/6003136440
  14. One of the first efforts towards adding fine-grained access control was a project called Apache Accumulo. Accumulo is similar to HBase in that it's also based on Google's BigTable design. One of the places where Accumulo departed from the source material was adding an element to the key that provides a security label at the cell level. This is very useful as it provides very fine-grained access; unfortunately, the scanning speed of Accumulo and HBase isn't as fast as scanning HDFS directly, so they're not as ideal for large batch workloads (hint: that's what MapReduce was designed for). HBase initially added security at the table and column level, but the developers saw how much fun Accumulo was having, so they added a cell-level option in the latest release. Image source: https://www.flickr.com/photos/28096801@N05/4332863749
  15. Not wanting to be left out of the party, the community has done some work to provide fine-grained access control to data stored in HDFS. The two most popular ways of doing that today are with Apache Sentry (Incubating) and Apache Hive's new, next-generation authorization features. Sentry works by plugging into existing projects and adding RBAC from the outside. This is nice because it gives you a common way of controlling access to data across the different file formats and processing engines. Today, Sentry supports Hive, Impala and Apache Solr. Sentry only provides access control down to the view level, but you can simulate column- and even row-level access by creating views that expose a subset of columns or that filter rows. The downside to these methods is that today, you only get the access control if you're accessing your data through one of the supported engines. You can't have granular access through both Hive and regular MapReduce. Image source: https://www.flickr.com/photos/47217301@N06/5175262042
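The view trick described above can be sketched in a few lines. This is a hypothetical example (the JDBC URL, table, column, and view names are all made up); Sentry would then be configured to grant a role SELECT on the view while leaving the base table inaccessible:

```shell
# Create a Hive view that exposes only non-sensitive columns and filters rows.
# Granting access to the view (not the base table) simulates column- and
# row-level control on top of Sentry's view-level authorization.
beeline -u "jdbc:hive2://hive-server:10000/default" -e "
  CREATE VIEW customers_public AS
  SELECT name, city          -- sensitive columns such as ssn are omitted
  FROM customers
  WHERE region = 'PUBLIC';   -- simulates row-level filtering
"
```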
  16. Because setting up Hadoop security is still somewhat complex, a recent trend has been to try to get all of your security at the perimeter. This isn't necessarily a new idea: folks have been using Hue for a number of years to provide limited access at the boundary. What's different is that more and more users are looking for perimeter-controlled API access, not just a user-facing GUI. The most popular dedicated project in this area is Apache Knox. Knox is nice in that it lets you grant access to select users through a proxy service. The downside is that Knox implements its own REST APIs, so you can't just take a standard Hadoop client and point it at Knox. The other limit of perimeter security is that if users are allowed to upload jar files and submit jobs, you still need all of the other security features enabled to prevent jobs from running amok. Image source: https://www.flickr.com/photos/sidibousaid/6985757255
  17. This brings me to the topic of trust. By default, all of the data in Hadoop is stored in the clear. That means that you have to trust the system administrators that have root access to the cluster not to go poking around in data they shouldn't. You also trust that your network is secure and that malicious users can't capture or sniff traffic that wasn't meant for them. These assumptions are fine for a large number of users, but they don't satisfy the most paranoid among us. Image source: https://www.flickr.com/photos/powerbooktrance/466709245
  18. For these users, Hadoop supports encryption in a number of different places. Today, it is largely available for data and metadata that go over the wire. You can encrypt Hadoop's RPC protocol, the data block streaming protocol, and the MapReduce shuffle. This encryption is implemented with SASL for RPC and block streaming, which limits the encryption codec options to a certain degree. Shuffle encryption is implemented with SSL, so it supports whatever cipher suites are available in Java's SSL implementation. Image source: https://www.flickr.com/photos/greeblemonkey/2689378015
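The three wire-encryption knobs mentioned above correspond to standard Hadoop configuration properties. A minimal config sketch (full SSL setup for the shuffle also requires keystore/truststore configuration, omitted here):

```xml
<!-- core-site.xml: SASL "privacy" = authentication + integrity + encryption
     for Hadoop RPC -->
<property>
  <name>hadoop.rpc.protection</name>
  <value>privacy</value>
</property>

<!-- hdfs-site.xml: encrypt the DataNode block-streaming protocol -->
<property>
  <name>dfs.encrypt.data.transfer</name>
  <value>true</value>
</property>

<!-- mapred-site.xml: encrypt the MapReduce shuffle over SSL -->
<property>
  <name>mapreduce.shuffle.ssl.enabled</name>
  <value>true</value>
</property>
```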
  19. Over the wire is great and all, but what about disk-based encryption? Today, Hadoop doesn't support native disk-level encryption. Historically, users that needed to encrypt their data on disk used third-party tools that would encrypt the data volumes. HDFS-6134 is ongoing upstream to add native encryption at rest to HDFS and should land in a future release. You may have also seen the announcement that Cloudera has bought Gazzang and we're supporting their Hadoop encryption solution on our stack. When HDFS-6134 is merged in, you'll be able to do block-level encryption on HDFS for a pre-specified list of files and directories. The coolest part is that the key management is pluggable, so you'll be able to use whichever system best meets your needs. Pluggable, scalable encryption for all your big data needs. Yes please. Image source: https://www.flickr.com/photos/kubina/326629513
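Based on the HDFS-6134 design discussed upstream, usage is expected to look roughly like the following (key and zone names are illustrative, and the commands are subject to change until the feature ships):

```shell
# Create an encryption key in the pluggable key management service.
hadoop key create finance-key

# Mark a directory as an encryption zone backed by that key; files written
# under it are then transparently encrypted and decrypted for authorized
# clients, per the HDFS-6134 design.
hdfs crypto -createZone -keyName finance-key -path /data/finance
```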
  20. So, what does the future hold? Well, if you've been following along, you've seen a number of themes that are pushing Hadoop's security boundaries. Hadoop clusters are getting larger and serving more diverse user bases. Increasingly, deployments no longer trust the network or the administrators running the cluster. Folks are accessing ever-increasing volumes of data with more and more diverse processing engines. Image source: https://www.flickr.com/photos/pasukaru76/3998273279
  21. This brings us back to granularity. HBase has already added cell-level security; will HDFS follow suit? Obviously it's harder for a file system to protect objects that are contained within the files themselves, so more creative solutions will likely be necessary. HDFS is already increasing the flexibility for protecting files and directories. Hadoop 2.4.0 added support for file system access control lists (ACLs). This means you're no longer limited to the POSIX-style permissions that have been around since Hadoop 0.16. Image source: https://www.flickr.com/photos/dnet/5921138809
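The HDFS ACL support added in Hadoop 2.4.0 is driven through `hdfs dfs -setfacl`/`-getfacl`. A short sketch (the users, groups, and paths are hypothetical):

```shell
# Grant an extra user read access beyond the POSIX owner/group/other bits.
hdfs dfs -setfacl -m user:alice:r-- /data/reports

# Give a second group read/execute on a directory, plus a default ACL entry
# that newly created children will inherit.
hdfs dfs -setfacl -m group:auditors:r-x,default:group:auditors:r-x /data/reports

# Inspect the resulting ACL.
hdfs dfs -getfacl /data/reports
```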
  22. I’m personally excited for the future of Sentry. Sentry already supports more projects than any of the other fine grain access control solutions out there. You can expect to see it integrate with more and more processing engines in the future. There is already extensive work being done to add a DB-based backend to Sentry (SENTRY-37) which will be more sustainable than the configuration file used today. That work will also enable the ability to use SQL-based GRANT/REVOKE commands to update permissions and roles. In keeping with the granularity theme, there is also a proposal to add true column-level access control to Sentry. This will eliminate the need to create views in order to simulate column-level access. But what I’m most excited about are plans to add a Sentry Record Service on top of HDFS. This would abstract away the access to files on the file system with a distributed services for accessing records. This service will be accessible from any processing engine and any level of security, including cell-level, could be implemented. The design work is still early and subject to change, but APIs are planned to be fully layered so you could add new security operations like transparent encryption directly into the service. This is going to be huge. Image source: https://www.flickr.com/photos/24874528@N04/6863197796
  23. I already spoke a bit about what encryption features are available today and what's in the pipeline. A big proponent of the additional encryption technologies is Intel, through its Project Rhino initiative. Through Rhino, a number of encryption and other security-related enhancements have been proposed and worked on in upstream Hadoop. One of the biggest benefits of the work being completed under Rhino is support for accelerated encryption codecs when running on certain Intel chips. This will enable large-scale deployments of encryption without huge performance penalties. For its part, Cloudera is doubling down on Rhino and investing heavily in the security of Apache Hadoop. Image source: https://www.flickr.com/photos/gwegner/8247098628
  24. I want to leave you with this final thought. All of the work that has been done to add security to Hadoop and all of the work that’s coming down the pipeline is there to enhance your ability to share data. This is the key idea behind Hadoop. You can argue about which processing engine to use or even which file system, but Hadoop as an idea is bigger than all of them. Hadoop, for the first time, gives us a platform where you can store all of your data, access it in whatever way makes the most sense, and can do so while sharing huge clusters among thousands of users. Security is really at the heart of this. All of the great capabilities of Hadoop are for naught if it won’t let you securely share and access your data. Image source: https://www.flickr.com/photos/ben_grey/4582294721
  25. So, play a game and have some fun. Image source: https://www.flickr.com/photos/davef3138/2685563950