akshath@baabte.com
facebook.com/akshath.kumar180
twitter.com/akshath4u
in.linkedin.com/in/akshathkumar
HADOOP 2.4 INSTALLATION ON UBUNTU 14.04
Hadoop is a free, Java-based programming framework that supports the
processing of large data sets in a distributed computing environment. It is part
of the Apache project sponsored by the Apache Software Foundation.
In this presentation, we'll install a single-node Hadoop cluster backed by the Hadoop
Distributed File System on Ubuntu.
HADOOP ON 64-BIT UBUNTU 14.04
➢ INSTALLING JAVA
➢ ADDING A DEDICATED HADOOP USER
➢ INSTALLING SSH
➢ CREATE AND SETUP SSH CERTIFICATES
➢ INSTALL HADOOP
➢ SETUP CONFIGURATION FILES
➢ FORMAT THE NEW HADOOP FILESYSTEM
➢ STARTING HADOOP
➢ STOPPING HADOOP
➢ HADOOP WEB INTERFACES
➢ USING HADOOP
STEPS FOR INSTALLING HADOOP
❖ The Hadoop framework is written in Java!
➢ Open a terminal in Ubuntu 14.04 and follow the steps below.
INSTALLING JAVA
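➔ The install commands from the original slide are not preserved in this transcript. A minimal sketch, assuming the OpenJDK 7 packages shipped with Ubuntu 14.04 (which provide the /usr/lib/jvm/java-7-openjdk-amd64 path used later in the configuration):
$ sudo apt-get update
$ sudo apt-get install openjdk-7-jdk
$ java -version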
ADDING A DEDICATED HADOOP USER
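➔ The user-creation commands are likewise missing from the transcript. A typical sketch, assuming the hduser user and hadoop group referenced later in the chown commands:
$ sudo addgroup hadoop
$ sudo adduser --ingroup hadoop hduser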
➔ SSH has two main components:
◆ ssh : the command we use to connect to remote machines - the client.
◆ sshd : the daemon that runs on the server and allows clients to connect to
the server.
➔ The ssh client is usually pre-installed on Linux, but to run the sshd daemon we need to
install the ssh package first. Use this command to do that:
$ sudo apt-get install ssh
INSTALLING SSH
➔ Hadoop requires SSH access to manage its nodes, so we need to configure SSH access to
localhost. We therefore need SSH up and running on our machine, configured to allow SSH
public key authentication.
CREATE AND SETUP SSH CERTIFICATES
➔ The key-setup commands are sketched below; the second command adds the newly created key
to the list of authorized keys so that Hadoop can use ssh without prompting for a password.
◆ We can then check that ssh works.
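➔ The commands themselves are not preserved in the transcript; a common sequence, assuming an RSA key with an empty passphrase, is:
$ ssh-keygen -t rsa -P ""
$ cat $HOME/.ssh/id_rsa.pub >> $HOME/.ssh/authorized_keys
$ ssh localhost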
CREATE AND SETUP SSH CERTIFICATES
➔ Type the following commands in the Ubuntu terminal.
➔ We want to move the Hadoop installation to the /usr/local/hadoop directory using
the following commands:
INSTALL HADOOP
$ wget http://mirrors.sonic.net/apache/hadoop/common/hadoop-2.4.1/hadoop-2.4.1.tar.gz
$ tar xvzf hadoop-2.4.1.tar.gz
$ sudo mv hadoop-2.4.1 /usr/local/hadoop
$ sudo chown -R hduser:hadoop /usr/local/hadoop
$ pwd
/usr/local/hadoop
$ ls
bin etc include lib libexec LICENSE.txt NOTICE.txt README.txt sbin share
➔ The following files will have to be modified to complete the Hadoop
setup
1. ~/.bashrc
2. /usr/local/hadoop/etc/hadoop/hadoop-env.sh
3. /usr/local/hadoop/etc/hadoop/core-site.xml
4. /usr/local/hadoop/etc/hadoop/mapred-site.xml.template
5. /usr/local/hadoop/etc/hadoop/hdfs-site.xml
➔ First of all, we need to find the path where Java has been installed in order to set the
JAVA_HOME environment variable. Use the following command:
SETUP CONFIGURATION FILES
$ update-alternatives --config java
There is only one alternative in link group java (providing /usr/bin/java):
/usr/lib/jvm/java-7-openjdk-amd64/jre/bin/java
Nothing to configure.
1. ~/.bashrc
We can append the following to the end of ~/.bashrc
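The exact lines are not preserved in the transcript. A sketch of the environment variables commonly set for a Hadoop 2.x installation under /usr/local/hadoop, using the JAVA_HOME path found above (everything beyond JAVA_HOME and PATH follows common convention rather than the original slide):
# Hadoop environment variables (sketch)
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL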
SETUP CONFIGURATION FILES
2. /usr/local/hadoop/etc/hadoop/hadoop-env.sh
★ We need to set JAVA_HOME by modifying hadoop-env.sh file
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
★ Adding the above statement to the hadoop-env.sh file ensures that the value of
JAVA_HOME is available to Hadoop whenever it starts up.
SETUP CONFIGURATION FILES
3. /usr/local/hadoop/etc/hadoop/core-site.xml
★ The /usr/local/hadoop/etc/hadoop/core-site.xml file contains configuration
properties that Hadoop uses when starting up. This file can be used to override
the default settings that Hadoop starts with.
$ sudo mkdir -p /app/hadoop/tmp
$ sudo chown hduser:hadoop /app/hadoop/tmp
SETUP CONFIGURATION FILES
Open the file and enter the following between the <configuration>
</configuration> tags.
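The property block is missing from the transcript. A plausible minimal configuration, given the /app/hadoop/tmp directory created above (the HDFS URI below is an assumption, not confirmed by the slide):
<property>
<name>hadoop.tmp.dir</name>
<value>/app/hadoop/tmp</value>
<description>A base for other temporary directories.</description>
</property>
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:54310</value>
<description>The name of the default file system.</description>
</property>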
SETUP CONFIGURATION FILES
4. /usr/local/hadoop/etc/hadoop/mapred-site.xml
➔ By default, the /usr/local/hadoop/etc/hadoop/ folder contains a
/usr/local/hadoop/etc/hadoop/mapred-site.xml.template file, which has to be
copied and renamed to mapred-site.xml:
$ cp /usr/local/hadoop/etc/hadoop/mapred-site.xml.template \
/usr/local/hadoop/etc/hadoop/mapred-site.xml
SETUP CONFIGURATION FILES
➔ The mapred-site.xml file is used to specify which framework is being used for
MapReduce. We need to enter the following content between the
<configuration></configuration> tags.
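The property block is missing from the transcript. Since the deck later starts the YARN daemons (start-yarn.sh), a plausible minimal entry is the one below; note this is an assumption, and the original slide may instead have used the older mapred.job.tracker setting:
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>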
SETUP CONFIGURATION FILES
5. /usr/local/hadoop/etc/hadoop/hdfs-site.xml
➔ Before editing this file, we need to create two directories which will contain the namenode and the
datanode data for this Hadoop installation. This can be done using the following commands.
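The commands are missing from the transcript, but the paths can be recovered from the hdfs-site.xml values below; a sketch:
$ sudo mkdir -p /usr/local/hadoop_store/hdfs/namenode
$ sudo mkdir -p /usr/local/hadoop_store/hdfs/datanode
$ sudo chown -R hduser:hadoop /usr/local/hadoop_store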
SETUP CONFIGURATION FILES
➔ Open the file and enter the following content between the <configuration></configuration> tags
<configuration>
<property>
<name>dfs.replication</name>
<value>1</value>
<description>Default block replication.
The actual number of replications can be specified when the file is created.
The default is used if replication is not specified in create time.
</description>
</property>
<property>
<name>dfs.namenode.name.dir</name>
<value>file:/usr/local/hadoop_store/hdfs/namenode</value>
</property>
<property>
<name>dfs.datanode.data.dir</name>
<value>file:/usr/local/hadoop_store/hdfs/datanode</value>
</property>
</configuration>
SETUP CONFIGURATION FILES
➔ Now the Hadoop filesystem needs to be formatted so that we can start using it.
The format command should be run by a user with write permission, since it creates
a current directory under the /usr/local/hadoop_store/hdfs/namenode folder.
hduser@k:~$ hadoop namenode -format
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
...
14/07/13 22:13:10 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at k/127.0.1.1
************************************************************/
FORMAT THE NEW HADOOP FILESYSTEM
The hadoop namenode -format command should be executed only once, before we start using Hadoop. If it is
executed again after Hadoop has been used, it will destroy all the data on the Hadoop file system.
➔ Now it's time to start the newly installed single-node cluster. We can use
start-all.sh or (start-dfs.sh and start-yarn.sh)
hduser@k:/home/k$ start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
14/07/13 23:36:59 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your
platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
...
localhost: starting nodemanager,
logging to /usr/local/hadoop/logs/yarn-hduser-nodemanager-k.out
➔ Check whether it's really up and running; jps should list daemons such as NameNode, DataNode, SecondaryNameNode, ResourceManager, and NodeManager.
hduser@k:/home/k$ jps
➔ Another way to check is using netstat
hduser@k:/home/k$ netstat -plten | grep java
STARTING HADOOP
➔ Type the following commands into the terminal to stop Hadoop.
$ pwd
/usr/local/hadoop/sbin
$ ls
distribute-exclude.sh httpfs.sh start-all.sh
...
start-secure-dns.sh stop-balancer.sh stop-yarn.sh
➔ We run stop-all.sh or (stop-dfs.sh and stop-yarn.sh) to stop all the daemons
running on our machine:
$ /usr/local/hadoop/sbin/stop-all.sh
STOPPING HADOOP
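HADOOP WEB INTERFACES
➔ The slides for this section are not preserved in the transcript. For Hadoop 2.4, assuming default ports, the NameNode web UI is at http://localhost:50070 and the ResourceManager (YARN) web UI is at http://localhost:8088.
USING HADOOP
➔ A quick smoke test, assuming the example jar bundled with the 2.4.1 tarball:
$ hdfs dfs -mkdir -p /user/hduser/input
$ hdfs dfs -put /usr/local/hadoop/etc/hadoop/*.xml /user/hduser/input
$ hadoop jar /usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.4.1.jar wordcount /user/hduser/input /user/hduser/output
$ hdfs dfs -cat /user/hduser/output/part-r-00000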
Thank You...
US UK UAE
7002 Hana Road,
Edison NJ 08817,
United States of America.
90 High Street,
Cherry Hinton,
Cambridge, CB1 9HZ,
United Kingdom.
Suite No: 51, Oasis Center,
Sheikh Zayed Road, Dubai,
UAE
Email to info@baabtra.com or Visit baabtra.com
Looking to learn more about the above
topic?
India Centres
Emarald Mall (Big Bazar Building)
Mavoor Road, Kozhikode,
Kerala, India.
Ph: + 91 – 495 40 25 550
NC Complex, Near Bus Stand
Mukkam, Kozhikode,
Kerala, India.
Ph: + 91 – 495 40 25 550
Cafit Square IT Park,
Hilite Business Park,
Kozhikode
Kerala, India.
Email: info@baabtra.com
TBI - NITC
NIT Campus, Kozhikode.
Kerala, India.
Start up Village
Eranakulam,
Kerala, India.
Start up Village
UL CC
Kozhikode, Kerala
Follow us @ twitter.com/baabtra
Like us @ facebook.com/baabtra
Subscribe to us @ youtube.com/baabtra
Become a follower @ slideshare.net/BaabtraMentoringPartner
Connect to us @ in.linkedin.com/in/baabtra
Give a feedback @ massbaab.com/baabtra
Thanks in advance
www.baabtra.com | www.massbaab.com |www.baabte.com
Want to learn more about programming, or looking to become a good programmer?
Are you wasting time searching through endless content online?
Do you want to learn things quickly?
Tired of spending a huge amount of money to become a software professional?
Do an online course
@ baabtra.com
We put industry standards into practice. Our structured, activity-based courses are designed
to make a quick, good software professional out of anybody who holds a passion for coding.