
Hadoop Single Node Installation (Hadoop-2.7.1) on CentOS 6.4
1. Download the jdk-8u31 RPM and install it.
# rpm -ivh jdk-8u31-linux-x64.rpm
2. Create a normal user for the Hadoop installation.
# useradd hduser
# passwd hduser
3. Log in as hduser and set up key-based SSH.
# ssh-keygen -t rsa

# cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys


# chmod 0600 ~/.ssh/authorized_keys
Let's verify key-based login. The command below should not ask for a password, but the first
time it will prompt to add the host's RSA key to the list of known hosts.
# ssh localhost

4. Set JAVA_HOME in hduser's .bash_profile.

# vi .bash_profile
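Append something like the following, assuming the Oracle JDK 8u31 RPM installed to its
default location under /usr/java (verify the exact directory on your system):

export JAVA_HOME=/usr/java/jdk1.8.0_31
export PATH=$PATH:$JAVA_HOME/bin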

# . .bash_profile ( Source .bash_profile after updating to apply the changes )


# java -version

5. Download Hadoop-2.7.1 and keep it in hduser's home directory.


# wget http://archive.apache.org/dist/hadoop/core/hadoop-2.7.1/hadoop-2.7.1.tar.gz
# tar xvf hadoop-2.7.1.tar.gz
# mv hadoop-2.7.1 hadoop
# chown -R hduser:hduser hadoop ( Changing owner & group permissions )

Configuring Hadoop
6. Set up environment variables. First we need to set the environment variables used by
Hadoop. Edit the ~/.bashrc file and append the following values at the end of the file.
# vi .bashrc
export HADOOP_HOME=/home/hduser/hadoop
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin

# . .bashrc ( Source .bashrc after updating to apply the changes )


Now edit the $HADOOP_HOME/etc/hadoop/hadoop-env.sh file and set the JAVA_HOME
environment variable.
# vi /home/hduser/hadoop/etc/hadoop/hadoop-env.sh
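In hadoop-env.sh, set JAVA_HOME to the same JDK path used in .bash_profile (the
jdk1.8.0_31 path is an assumption based on the jdk-8u31 package; adjust if your install
differs):

export JAVA_HOME=/usr/java/jdk1.8.0_31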

7. Edit Configuration files


Hadoop has many configuration files, which need to be configured as per the requirements of
your Hadoop infrastructure. Let's start with the configuration for a basic Hadoop single-node
cluster setup. First navigate to the location below.
# cd /home/hduser/hadoop/etc/hadoop
# vi core-site.xml
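A minimal core-site.xml for a single-node setup typically looks like the following;
hdfs://localhost:9000 is the conventional choice, so adjust the host and port as needed:

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>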

Create folders for the Hadoop datanode and namenode, which will be referenced by the
values configured in hdfs-site.xml, as shown below.
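For example, assuming the data directories live under hduser's home (these paths are
illustrative and must match the values used in hdfs-site.xml):

# mkdir -p /home/hduser/hadoopdata/hdfs/namenode
# mkdir -p /home/hduser/hadoopdata/hdfs/datanode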

# vi hdfs-site.xml
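A typical single-node hdfs-site.xml sets the replication factor to 1 and points at the
directories created above (the paths below are the assumed ones from the mkdir step):

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:///home/hduser/hadoopdata/hdfs/namenode</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:///home/hduser/hadoopdata/hdfs/datanode</value>
  </property>
</configuration>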

# vi mapred-site.xml
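Note that mapred-site.xml does not exist by default; before editing, copy it from the
bundled template:

# cp mapred-site.xml.template mapred-site.xml

A minimal configuration tells MapReduce to run on YARN:

<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>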

# vi yarn-site.xml
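A minimal yarn-site.xml for a single node enables the shuffle service that MapReduce
requires:

<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
</configuration>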

8. Format Namenode
# hdfs namenode -format
Start Hadoop Cluster

Let's start your Hadoop cluster using the scripts provided by Hadoop. Just navigate to your
Hadoop sbin directory and execute the script.
# cd $HADOOP_HOME/sbin/
Now run the start-all.sh script (it is deprecated in Hadoop 2.x in favor of running
start-dfs.sh and start-yarn.sh separately, but it still works).
# sh start-all.sh
Access Hadoop Services in Browser
The Hadoop NameNode web UI starts on port 50070 by default.
http://192.168.1.241:50070/
Now access port 8088 (the ResourceManager) for information about the cluster and all applications.
http://192.168.1.241:8088/
Access port 50090 for details about the secondary namenode.
http://192.168.1.241:50090/
Access port 50075 for details about the DataNode.
http://192.168.1.241:50075/
