A GUIDE TO HADOOP
INSTALLATION
BY WizIQ
Hadoop Installation
• Putty connectivity
• Java Installation (openjdk-7-jdk)
• Group/User Creation & SSH Certification
• Hadoop Install (Hadoop 2.2.0)
• Hadoop configuration
• Hadoop Services
Putty Connectivity
User: ubuntu@<name>
Make sure you have SSH Auth configured:
browse to the .ppk file location
Java Installation
• ubuntu@ip-10-45-133-21:~$ sudo apt-get install openjdk-7-jdk
• Error: Err http://us-east-1.ec2.archive.ubuntu.com/ubuntu/ precise-updates/main liblvm2app2.2 amd64 2.02.66-4ubuntu7.1 403 Forbidden ...
• Solution:
ubuntu@ip-10-45-133-21:~$ sudo apt-get update
ubuntu@ip-10-45-133-21:~$ sudo apt-get install openjdk-7-jdk
ubuntu@ip-10-45-133-21:~$ cd /usr/lib/jvm
• Error:
ubuntu@ip-10-45-133-21:/usr/lib/jvm$ ln -s java-7-openjdk-amd64 jdk
ln: failed to create symbolic link `jdk': Permission denied
ubuntu@ip-10-45-133-21:/usr/lib/jvm$
• Solution:
ubuntu@ip-10-45-133-21:/usr/lib/jvm$ sudo su
root@ip-10-45-133-21:/usr/lib/jvm# ln -s java-7-openjdk-amd64 jdk
root@ip-10-45-133-21:/usr/lib/jvm# apt-get install openssh-server
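The `ln -s` step above creates a relative symlink named `jdk` pointing at the real JDK directory, which is why the later `JAVA_HOME=/usr/lib/jvm/jdk/` setting works regardless of the exact OpenJDK version string. A minimal sketch of the same step in a scratch directory (paths here are stand-ins, no sudo needed):

```shell
# Simulate the symlink creation from the slide in a throwaway directory
tmp=$(mktemp -d)
mkdir "$tmp/java-7-openjdk-amd64"          # stands in for the real JDK directory
ln -s java-7-openjdk-amd64 "$tmp/jdk"      # relative link, same form as on the slide
link_target=$(readlink "$tmp/jdk")         # resolves to the link's target name
rm -rf "$tmp"
echo "$link_target"                        # prints: java-7-openjdk-amd64
```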
Group/User Creation & SSH Certification
• ubuntu@ip-10-45-133-21$ sudo addgroup hadoop
• ubuntu@ip-10-45-133-21$ sudo adduser --ingroup hadoop hduser
• ubuntu@ip-10-45-133-21$ sudo adduser hduser sudo
**After the user is created, log in to Ubuntu again as hduser:
ubuntu@ip-10-45-133-21:~$ su -l hduser
Password:
Setup SSH Certificate
hduser@ip-10-45-133-21$ ssh-keygen -t rsa -P ''
...
Your identification has been saved in
/home/hduser/.ssh/id_rsa.
Your public key has been saved in
/home/hduser/.ssh/id_rsa.pub.
...
hduser@ip-10-45-133-21$ cat ~/.ssh/id_rsa.pub
>> ~/.ssh/authorized_keys
hduser@ip-10-45-133-21$ ssh localhost
**Note: the key was generated with an empty passphrase (-P ''), so no password is needed for id_rsa here.
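The key setup above can be rehearsed in an isolated directory before touching the real `~/.ssh`. This sketch assumes `ssh-keygen` is installed; the scratch directory stands in for `/home/hduser/.ssh`, and the `chmod` lines reflect the strict permissions sshd expects:

```shell
# Rehearse the passwordless-key setup in a scratch dir (not the real ~/.ssh)
d=$(mktemp -d)
chmod 700 "$d"                                   # sshd rejects group/world-readable key dirs
ssh-keygen -q -t rsa -P '' -f "$d/id_rsa"        # empty passphrase, as on the slide
cat "$d/id_rsa.pub" >> "$d/authorized_keys"      # authorize our own public key
chmod 600 "$d/authorized_keys"
have_key=$(test -s "$d/authorized_keys" && echo yes)
rm -rf "$d"
echo "$have_key"                                 # prints: yes
```

With the real files in place, `ssh localhost` should log in without prompting for a password.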
Hadoop Install(Hadoop 2.2.0)
• hduser@ip-10-45-133-21:~# su -l hduser
• hduser@ip-10-45-133-21$ cd ~
• hduser@ip-10-45-133-21$ wget http://www.trieuvan.com/apache/hadoop/common/hadoop-2.2.0/hadoop-2.2.0.tar.gz
• hduser@ip-10-45-133-21$ sudo tar vxzf hadoop-2.2.0.tar.gz -C /usr/local
• hduser@ip-10-45-133-21$ cd /usr/local
• hduser@ip-10-45-133-21$ sudo mv hadoop-2.2.0 hadoop
• hduser@ip-10-45-133-21$ sudo chown -R hduser:hadoop hadoop
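The extract-then-rename sequence above can be dry-run in a scratch directory; the tarball and the `marker.txt` file below are made up for the rehearsal, and no sudo is needed since everything stays under a temp path:

```shell
# Dry-run of: tar vxzf hadoop-2.2.0.tar.gz -C /usr/local && mv hadoop-2.2.0 hadoop
scratch=$(mktemp -d)
mkdir "$scratch/hadoop-2.2.0"
echo ok > "$scratch/hadoop-2.2.0/marker.txt"             # hypothetical payload file
tar czf "$scratch/hadoop-2.2.0.tar.gz" -C "$scratch" hadoop-2.2.0
mkdir "$scratch/usr_local"                               # stands in for /usr/local
tar xzf "$scratch/hadoop-2.2.0.tar.gz" -C "$scratch/usr_local"
mv "$scratch/usr_local/hadoop-2.2.0" "$scratch/usr_local/hadoop"
marker=$(cat "$scratch/usr_local/hadoop/marker.txt")
rm -rf "$scratch"
echo "$marker"                                           # prints: ok
```

Renaming to plain `hadoop` keeps the later `HADOOP_INSTALL=/usr/local/hadoop` setting version-independent, the same trick as the `jdk` symlink.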
Hadoop Configuration
• Setup Hadoop Environment Variables:
• hduser@ip-10-45-133-21$ cd ~
• hduser@ip-10-45-133-21$ vi .bashrc
Copy and paste the following at the end of the file:
#Hadoop variables
export JAVA_HOME=/usr/lib/jvm/jdk/
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
###end of paste
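The variable block above only takes effect in new shells (hence the re-login later). A quick way to sanity-check it is to write the exports to a file and source it, as sketched here with a temp file standing in for `.bashrc`:

```shell
# Source the export block from a temp file and confirm the variables land
env_file=$(mktemp)
cat > "$env_file" <<'EOF'
export JAVA_HOME=/usr/lib/jvm/jdk/
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
EOF
. "$env_file"                   # same effect as opening a fresh login shell
rm -f "$env_file"
echo "$HADOOP_INSTALL"          # prints: /usr/local/hadoop
```

Once `$HADOOP_INSTALL/bin` is on PATH, commands like `hadoop version` and `start-dfs.sh` resolve without full paths.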
hduser@ip-10-45-133-21$ cd /usr/local/hadoop/etc/hadoop
hduser@ip-10-45-133-21$ vi hadoop-env.sh
#modify JAVA_HOME
export JAVA_HOME=/usr/lib/jvm/jdk/
****Re-login into Ubuntu as hduser and check the hadoop version:****
hduser@ip-10-45-133-21$ hadoop version
Hadoop 2.2.0
Subversion https://svn.apache.org/repos/asf/hadoop/common -r 1529768
Compiled by hortonmu on 2013-10-07T06:28Z
Compiled with protoc 2.5.0
From source with checksum 79e53ce7994d1628b240f09af91e1af4
This command was run using /usr/local/hadoop-2.2.0/share/hadoop/common/hadoop-common-2.2.0.jar
At this point, Hadoop is installed.
Configure Hadoop :
hduser@ip-10-45-133-21$ cd /usr/local/hadoop/etc/hadoop
hduser@ip-10-45-133-21$ vi core-site.xml
#Paste the following between the <configuration> tags:
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:9000</value>
</property>
hduser@ip-10-45-133-21$ vi yarn-site.xml
#Paste the following between the <configuration> tags:
<property>
<name>yarn.nodemanager.aux-services</name>
<value>mapreduce_shuffle</value>
</property>
<property>
<name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
<value>org.apache.hadoop.mapred.ShuffleHandler</value>
</property>
hduser@ip-10-45-133-21$ mv mapred-site.xml.template mapred-site.xml
hduser@ip-10-45-133-21$ vi mapred-site.xml
#Paste the following between the <configuration> tags:
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>
hduser@ip-10-45-133-21$ cd ~
hduser@ip-10-45-133-21$ mkdir -p mydata/hdfs/namenode
hduser@ip-10-45-133-21$ mkdir -p mydata/hdfs/datanode
hduser@ip-10-45-133-21$ cd /usr/local/hadoop/etc/hadoop
hduser@ip-10-45-133-21$ vi hdfs-site.xml
#Paste the following between the <configuration> tags:
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
<property>
<name>dfs.namenode.name.dir</name>
<value>file:/home/hduser/mydata/hdfs/namenode</value>
</property>
<property>
<name>dfs.datanode.data.dir</name>
<value>file:/home/hduser/mydata/hdfs/datanode</value>
</property>
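A common slip when hand-editing these XML files is an unclosed `<property>` element, which makes the daemons fail at startup with a parse error. A rough sanity check is to compare open and close tag counts, sketched here against a temp file with the `dfs.replication` block from above:

```shell
# Rough well-formedness check: <property> opens should equal </property> closes
f=$(mktemp)
cat > "$f" <<'EOF'
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
EOF
opens=$(grep -c '<property>' "$f")
closes=$(grep -c '</property>' "$f")
rm -f "$f"
echo "$opens $closes"            # prints: 1 1
```

This only catches tag-count mismatches; `xmllint --noout file.xml` is a stricter check where libxml2 is installed.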
Format Namenode :
hduser@ip-10-45-133-21$ hdfs namenode -format
Hadoop Services
hduser@ip-10-45-133-21$ start-dfs.sh
....
hduser@ip-10-45-133-21$ start-yarn.sh
....
Error: startup failed at first (see the errors below).
Checked the following again:
1) SSH certificate creation
2) ssh localhost
3) vi hadoop-env.sh
4) vi hdfs-site.xml
5) Namenode and datanode directories
6) Reformatted the namenode
7) Started the services
After that it started working, as the following screens show; it is not clear which of these steps fixed the issue.
• Error1: WARN util.NativeCodeLoader: Unable to load native-hadoop library for
your platform... using builtin-java classes where applicable ….
• Error2: Localhost: Permission Denied (Publickey)…
• hduser@ip-10-45-133-21:~$ jps
• Everything ran successfully and the services below were visible and running.
• Reconnected, and everything works fine.
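For a healthy single-node setup, `jps` should list five Hadoop daemons besides `Jps` itself. This sketch runs the check against simulated `jps` output (the PIDs are made up) rather than a live cluster:

```shell
# Check that all expected daemons appear in (simulated) jps output
jps_out='2100 NameNode
2250 DataNode
2400 SecondaryNameNode
2550 ResourceManager
2700 NodeManager
2850 Jps'
missing=""
for d in NameNode DataNode SecondaryNameNode ResourceManager NodeManager; do
  printf '%s\n' "$jps_out" | grep -q " $d\$" || missing="$missing $d"
done
echo "missing:$missing"          # prints: missing:
```

On the real box, replace the heredoc with `jps_out=$(jps)`; any daemon named in `missing` is the one whose log under `$HADOOP_INSTALL/logs` to read first.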
THANK YOU!
Enroll now for Hadoop and Big Data Training @ WizIQ.com
Visit: http://www.wiziq.com/course/21308-hadoop-big-data-training
For more information, feel free to contact us at courses@wiziq.com
