
Hadoop Single Node Cluster Configuration on Linux Ubuntu
(By: Imam Cholissodin)
In this single-node cluster, only the master PC is used.
The steps are as follows:
1. Prepare an Ubuntu virtual machine, e.g. in VirtualBox, with a
40 GB HDD and 512 MB-1024 MB of RAM.
2. Set up the master PC with the following configuration:
a. Open a terminal and type sudo nano /etc/hosts
b. sudo apt-get update
c. sudo apt-get install default-jdk (verify with java -version)
d. sudo addgroup hadoop
e. sudo adduser --ingroup hadoop hduser
f. sudo adduser hduser sudo
g. sudo apt-get install ssh
h. su hduser
i. ssh-keygen -t rsa -P ""
j. Type cat $HOME/.ssh/id_rsa.pub >> $HOME/.ssh/authorized_keys
k. wget http://mirror.wanxp.id/apache/hadoop/common/hadoop-2.7.3/hadoop-2.7.3.tar.gz
l. hduser@Master:~$ sudo tar xvzf hadoop-2.7.3.tar.gz
m. hduser@Master:~$ sudo mv hadoop-2.7.3 /usr/local/hadoop
If you see the error "hduser is not in the sudoers file. This
incident will be reported.", fix it as follows:
hduser@Master:~$ exit
nidos@Master:~$ sudo adduser hduser sudo
[sudo] password for nidos:
Adding user `hduser' to group `sudo' ...
Adding user hduser to group sudo
Done.
nidos@Master:~$ su hduser
hduser@Master:/home/nidos$ cd
hduser@Master:~$ sudo mv hadoop-2.7.3 /usr/local/hadoop
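The group and SSH steps above (d-j) can be sanity-checked with a short script. This is a sketch, run as hduser; it only reads standard locations and invents no state. Note that group changes only take effect on a fresh login, which is why `su hduser` is repeated after `adduser hduser sudo`.

```shell
# Sanity checks for steps d-j (a sketch; run as hduser).
u=$(id -un)
for g in hadoop sudo; do
  if id -nG "$u" | tr ' ' '\n' | grep -qx "$g"; then
    echo "$u is in group $g"
  else
    echo "$u is NOT in group $g yet (log in again after adduser)"
  fi
done
# The public key from step i should appear in authorized_keys (step j),
# so that "ssh localhost" works without a password.
if [ -f "$HOME/.ssh/id_rsa.pub" ] && \
   grep -qF -f "$HOME/.ssh/id_rsa.pub" "$HOME/.ssh/authorized_keys" 2>/dev/null; then
  echo "public key is authorized; ssh localhost should not prompt for a password"
else
  echo "public key not found in authorized_keys"
fi
```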

Figure 1. Hadoop installed at /usr/local/hadoop


hduser@Master:~$ sudo nano ~/.bashrc
add the following at the end of the file:

export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
hduser@Master:~$ source ~/.bashrc
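After sourcing ~/.bashrc, the Hadoop variables should resolve. The check below is a self-contained sketch: it sources the exports from a temporary file so it can run anywhere; on the real machine, just run the echo and case lines after `source ~/.bashrc`.

```shell
# Sketch: verify the environment after sourcing the exports above.
ENVFILE=$(mktemp)
cat > "$ENVFILE" <<'EOF'
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
EOF
. "$ENVFILE"
echo "HADOOP_INSTALL=$HADOOP_INSTALL"
echo "HADOOP_MAPRED_HOME=$HADOOP_MAPRED_HOME"
# PATH must now include the Hadoop bin directory for later commands
# such as "hdfs namenode -format" and "start-dfs.sh" to be found.
case ":$PATH:" in
  *":/usr/local/hadoop/bin:"*) echo "PATH includes Hadoop bin" ;;
  *) echo "PATH is missing Hadoop bin" ;;
esac
```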
hduser@Master:~$ sudo nano /usr/local/hadoop/etc/hadoop/hadoop-env.sh

change the JAVA_HOME line to:
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
hduser@Master:~$ sudo nano /usr/local/hadoop/etc/hadoop/core-site.xml

<configuration>
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:9000</value>
</property>
</configuration>
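An unclosed tag in any of these config files will later make `hdfs namenode -format` abort with a SAXParseException, so a quick balance check pays off. This sketch writes the core-site.xml content from above to a temporary file for illustration; on the real machine, point CONF at the actual file instead.

```shell
# Sketch: check that every tag in a Hadoop config file is closed.
CONF=$(mktemp)
cat > "$CONF" <<'EOF'
<configuration>
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:9000</value>
</property>
</configuration>
EOF
# Count opening vs closing occurrences of each tag; they must match.
for tag in name value property configuration; do
  open=$(grep -c "<$tag>" "$CONF")
  close=$(grep -c "</$tag>" "$CONF")
  if [ "$open" -eq "$close" ]; then
    echo "OK: <$tag> balanced ($open)"
  else
    echo "MALFORMED: <$tag> has $open opening vs $close closing tag(s)"
  fi
done
```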


hduser@Master:~$ cp /usr/local/hadoop/etc/hadoop/mapred-site.xml.template /usr/local/hadoop/etc/hadoop/mapred-site.xml
hduser@Master:~$ sudo nano /usr/local/hadoop/etc/hadoop/mapred-site.xml

<configuration>
<property>
<name>mapred.job.tracker</name>
<value>localhost:54311</value>
</property>
</configuration>

hduser@Master:~$ sudo mkdir -p /usr/local/hadoop_tmp/hdfs/namenode
hduser@Master:~$ sudo mkdir -p /usr/local/hadoop_tmp/hdfs/datanode
hduser@Master:~$ sudo chown -R hduser /usr/local/hadoop_tmp

Figure 2. Creating the namenode and datanode directories
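The chown step matters because the NameNode and DataNode run as hduser and must be able to write these directories. The sketch below verifies existence and writability; it uses a temporary root so it is self-contained, while on the real machine you would set ROOT=/usr/local/hadoop_tmp.

```shell
# Sketch: confirm the HDFS directories exist and are writable by this user.
ROOT=$(mktemp -d)   # stands in for /usr/local/hadoop_tmp in this illustration
mkdir -p "$ROOT/hdfs/namenode" "$ROOT/hdfs/datanode"
for d in "$ROOT/hdfs/namenode" "$ROOT/hdfs/datanode"; do
  if [ -d "$d" ] && [ -w "$d" ]; then
    echo "ok: $d"
  else
    echo "problem: $d (re-check the mkdir/chown steps above)"
  fi
done
```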



hduser@Master:~$ sudo nano /usr/local/hadoop/etc/hadoop/hdfs-site.xml

<configuration>
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
<property>
<name>dfs.namenode.name.dir</name>
<value>file:/usr/local/hadoop_tmp/hdfs/namenode</value>
</property>
<property>
<name>dfs.datanode.data.dir</name>
<value>file:/usr/local/hadoop_tmp/hdfs/datanode</value>
</property>
</configuration>
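dfs.replication is set to 1 because a single-node cluster has only one DataNode to hold each block. The sketch below extracts that value from the file with grep/sed, using a temporary copy of the config above so it runs anywhere; on the real machine, point CONF at /usr/local/hadoop/etc/hadoop/hdfs-site.xml.

```shell
# Sketch: read dfs.replication back out of hdfs-site.xml.
CONF=$(mktemp)
cat > "$CONF" <<'EOF'
<configuration>
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
</configuration>
EOF
# Take the line after the matching <name> and strip the <value> tags.
repl=$(grep -A1 '<name>dfs.replication</name>' "$CONF" \
       | sed -n 's:.*<value>\(.*\)</value>.*:\1:p')
echo "dfs.replication = $repl"
```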

hduser@Master:~$ hdfs namenode -format


16/11/1401:45:23INFOnamenode.NameNode:
STARTUP_MSG:
/
***********************************************************
*
STARTUP_MSG:StartingNameNode
STARTUP_MSG:host=Master/127.0.1.1
STARTUP_MSG:args=[format]
STARTUP_MSG:version=2.7.3
STARTUP_MSG:classpath=
/usr/local/hadoop/etc/hadoop:/usr/local/hadoop/share/hadoop
/common/lib/log4j
1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305

3.0.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons
beanutilscore
1.8.0.jar:/usr/local/hadoop/share/hadoop/common/lib/api
asn1api1.0.0
M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey
core
1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/curator
framework
2.7.1.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcor
e
4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty
util
6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/gson
2.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/api
util1.0.0
M20.jar:/usr/local/hadoop/share/hadoop/common/lib/jets3t
0.9.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jettiso
n
1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds
i18n2.0.0
M15.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclien
t

13

4.2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons
httpclient
3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/curator
client
2.7.1.jar:/usr/local/hadoop/share/hadoop/common/lib/xz
1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/hamcrest
core
1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/commons
digester
1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/commons
configuration
1.6.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper

3.4.6.jar:/usr/local/hadoop/share/hadoop/common/lib/commons
cli
1.2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons
logging
1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j
log4j12
1.7.10.jar:/usr/local/hadoop/share/hadoop/common/lib/java
xmlbuilder
0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/staxapi
1.0
2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons
net3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb
impl2.2.3
1.jar:/usr/local/hadoop/share/hadoop/common/lib/jsch
0.1.42.jar:/usr/local/hadoop/share/hadoop/common/lib/mockit
oall
1.8.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson
xc
1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/parana
mer
2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/xmlenc
0.52.jar:/usr/local/hadoop/share/hadoop/common/lib/avro
1.7.4.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j
api
1.7.10.jar:/usr/local/hadoop/share/hadoop/common/lib/asm
3.2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons
beanutils
1.7.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons
io2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp
api
2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty
6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/common
scollections
3.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey
server
1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey
json

14

1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/commons
math3
3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/apached
skerberoscodec2.0.0
M15.jar:/usr/local/hadoop/share/hadoop/common/lib/junit
4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf
java
2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson
coreasl
1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/htrace
core3.1.0
incubating.jar:/usr/local/hadoop/share/hadoop/common/lib/ne
tty
3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/j
acksonmapperasl
1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop
auth
2.7.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson
jaxrs
1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb
api
2.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons
codec
1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons
lang
2.6.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy
java
1.0.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commo
nscompress
1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/guava
11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/curato
rrecipes
2.7.1.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet
api
2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop
annotations
2.7.3.jar:/usr/local/hadoop/share/hadoop/common/lib/activat
ion1.1.jar:/usr/local/hadoop/share/hadoop/common/hadoop
common
2.7.3.jar:/usr/local/hadoop/share/hadoop/common/hadoop
common2.7.3
tests.jar:/usr/local/hadoop/share/hadoop/common/hadoopnfs
2.7.3.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/ha
doop/share/hadoop/hdfs/lib/log4j
1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305
3.0.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey
core1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty
util
6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons
daemon

15

1.0.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/leveldbj
niall
1.8.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/nettyall
4.0.23.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/co
mmonscli
1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons
logging
1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xercesImp
l2.9.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xmlenc
0.52.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm
3.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commonsio
2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty
6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey
server1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xml
apis
1.3.04.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf
java
2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson
coreasl
1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/htrace
core3.1.0
incubating.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/nett
y
3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jac
ksonmapperasl
1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons
codec
1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons
lang2.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava
11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/servlet
api2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop
hdfsnfs
2.7.3.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoophdfs
2.7.3tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop
hdfs
2.7.3.jar:/usr/local/hadoop/share/hadoop/yarn/lib/log4j
1.2.17.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jsr305
3.0.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey
core1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty
util
6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jettison

1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/leveldbjni
all1.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice
3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey
client1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/xz
1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zookeeper
3.4.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons
cli
1.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons

16

logging
1.1.3.jar:/usr/local/hadoop/share/hadoop/yarn/lib/staxapi
1.02.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb
impl2.2.3
1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jerseyguice
1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jacksonxc
1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopallia
nce1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/asm
3.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commonsio
2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty
6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zookeepe
r3.4.6
tests.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons
collections
3.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey
server
1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey
json1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice
servlet
3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/protobuf
java
2.5.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson
coreasl
1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/netty
3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jac
ksonmapperasl
1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson
jaxrs
1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb
api
2.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons
codec
1.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons
lang
2.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons
compress
1.4.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guava
11.0.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/servlet
api
2.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.injec
t1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/activation
1.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoopyarn
serverresourcemanager
2.7.3.jar:/usr/local/hadoop/share/hadoop/yarn/hadoopyarn
client
2.7.3.jar:/usr/local/hadoop/share/hadoop/yarn/hadoopyarn
serversharedcachemanager
2.7.3.jar:/usr/local/hadoop/share/hadoop/yarn/hadoopyarn
applicationsunmanagedamlauncher
2.7.3.jar:/usr/local/hadoop/share/hadoop/yarn/hadoopyarn

17

servertests
2.7.3.jar:/usr/local/hadoop/share/hadoop/yarn/hadoopyarn
registry
2.7.3.jar:/usr/local/hadoop/share/hadoop/yarn/hadoopyarn
serverapplicationhistoryservice
2.7.3.jar:/usr/local/hadoop/share/hadoop/yarn/hadoopyarn
applicationsdistributedshell
2.7.3.jar:/usr/local/hadoop/share/hadoop/yarn/hadoopyarn
api2.7.3.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop
yarnserverwebproxy
2.7.3.jar:/usr/local/hadoop/share/hadoop/yarn/hadoopyarn
common
2.7.3.jar:/usr/local/hadoop/share/hadoop/yarn/hadoopyarn
servernodemanager
2.7.3.jar:/usr/local/hadoop/share/hadoop/yarn/hadoopyarn
servercommon
2.7.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/log4
j
1.2.17.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jer
seycore
1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/leveld
bjniall
1.8.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice
3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/xz
1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcre
stcore
1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey
guice
1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/parana
mer
2.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/aopall
iance
1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/avro
1.7.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/asm
3.2.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/common
sio
2.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey
server
1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit
4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice
servlet
3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/protob
ufjava
2.5.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jack
soncoreasl
1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/net
ty
3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/mapreduce/li
b/jacksonmapperasl
1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/sna

18

ppyjava
1.0.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/co
mmonscompress
1.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/java
x.inject
1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hadoop
annotations
2.7.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop
mapreduceclienthsplugins
2.7.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop
mapreduceclientapp
2.7.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop
mapreduceclientcore
2.7.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop
mapreduceclientjobclient
2.7.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop
mapreduceclientshuffle
2.7.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop
mapreduceclientcommon
2.7.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop
mapreduceexamples
2.7.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop
mapreduceclientjobclient2.7.3
tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop
mapreduceclienths2.7.3.jar:/contrib/capacity
scheduler/*.jar
STARTUP_MSG:build=https://gitwip
us.apache.org/repos/asf/hadoop.gitr
baa91f7c6bc9cb92be5982de4719c1c8af91ccff;compiledby
'root'on20160818T01:41Z
STARTUP_MSG:java=1.7.0_111
*****************************************************
*******/
16/11/1401:45:24INFOnamenode.NameNode:registered
UNIXsignalhandlersfor[TERM,HUP,INT]
16/11/1401:45:24INFOnamenode.NameNode:
createNameNode[format]
[FatalError]hdfssite.xml:33:3:Theelementtype
"name"mustbeterminatedbythematchingendtag
"</name>".
16/11/1401:45:27FATALconf.Configuration:error
parsingconfhdfssite.xml
org.xml.sax.SAXParseException; systemId: file:/usr/local/hadoop/etc/hadoop/hdfs-site.xml; lineNumber: 33; columnNumber: 3; The element type "name" must be terminated by the matching end-tag "</name>".
	at org.apache.xerces.parsers.DOMParser.parse(Unknown Source)
	at org.apache.xerces.jaxp.DocumentBuilderImpl.parse(Unknown Source)
	at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:150)
	at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2480)
	at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2468)
	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2539)
	at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2492)
	at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2405)
	at org.apache.hadoop.conf.Configuration.set(Configuration.java:1143)
	at org.apache.hadoop.conf.Configuration.set(Configuration.java:1115)
	at org.apache.hadoop.conf.Configuration.setBoolean(Configuration.java:1451)
	at org.apache.hadoop.util.GenericOptionsParser.processGeneralOptions(GenericOptionsParser.java:321)
	at org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions(GenericOptionsParser.java:487)
	at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:170)
	at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:153)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1422)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1559)
16/11/14 01:45:27 ERROR namenode.NameNode: Failed to start namenode.
java.lang.RuntimeException: org.xml.sax.SAXParseException; systemId: file:/usr/local/hadoop/etc/hadoop/hdfs-site.xml; lineNumber: 33; columnNumber: 3; The element type "name" must be terminated by the matching end-tag "</name>".
	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2645)
	at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2492)
	at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2405)
	at org.apache.hadoop.conf.Configuration.set(Configuration.java:1143)
	at org.apache.hadoop.conf.Configuration.set(Configuration.java:1115)
	at org.apache.hadoop.conf.Configuration.setBoolean(Configuration.java:1451)
	at org.apache.hadoop.util.GenericOptionsParser.processGeneralOptions(GenericOptionsParser.java:321)
	at org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions(GenericOptionsParser.java:487)
	at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:170)
	at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:153)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1422)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1559)
Caused by: org.xml.sax.SAXParseException; systemId: file:/usr/local/hadoop/etc/hadoop/hdfs-site.xml; lineNumber: 33; columnNumber: 3; The element type "name" must be terminated by the matching end-tag "</name>".
	at org.apache.xerces.parsers.DOMParser.parse(Unknown Source)
	at org.apache.xerces.jaxp.DocumentBuilderImpl.parse(Unknown Source)
	at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:150)
	at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2480)
	at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2468)
	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2539)
	... 11 more
16/11/14 01:45:27 INFO util.ExitUtil: Exiting with status 1
16/11/14 01:45:27 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at Master/127.0.1.1
************************************************************/
hduser@Master:~$
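The failure above comes from a malformed hdfs-site.xml: a `<name>` element was never closed. Before restarting, the site files can be sanity-checked. The snippet below is a rough sketch that counts opening and closing `<name>` tags on an inline broken sample like the one that caused the crash; a real XML validator such as xmllint is more thorough if it is installed.

```shell
# A broken property like the one that crashed the NameNode: </name> is missing.
xml='<property><name>dfs.replication<value>3</value></property>'
opens=$(printf '%s' "$xml" | grep -o '<name>' | wc -l)
closes=$(printf '%s' "$xml" | grep -o '</name>' | wc -l)
if [ "$opens" -eq "$closes" ]; then
    echo "name tags balanced"
else
    echo "unbalanced name tags: $opens opened, $closes closed"
fi
```

To check the real files, run the same counting (or `xmllint --noout`) over each file in /usr/local/hadoop/etc/hadoop/.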

hduser@Master:~$ start-all.sh

This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh

16/11/14 01:51:08 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
The authenticity of host 'localhost (127.0.0.1)' can't be established.
ECDSA key fingerprint is 4d:af:12:7a:3a:61:e3:a4:dd:bf:eb:5a:4b:0d:8d:08.
Are you sure you want to continue connecting (yes/no)? yes

Then the following appears:

localhost: Warning: Permanently added 'localhost' (ECDSA) to the list of known hosts.
localhost: starting namenode, logging to /usr/local/hadoop/logs/hadoop-hduser-namenode-Master.out
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-hduser-datanode-Master.out
Starting secondary namenodes [0.0.0.0]
The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.
ECDSA key fingerprint is 4d:af:12:7a:3a:61:e3:a4:dd:bf:eb:5a:4b:0d:8d:08.
Are you sure you want to continue connecting (yes/no)? yes

hduser@Master:~$ jps

13594 Jps
13198 ResourceManager
13053 SecondaryNameNode
12869 DataNode
13319 NodeManager

Note: JPS is the JVM Process Status Tool.


Open http://localhost:50070 in a browser to see the NameNode web UI.
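As a quick sanity check, the jps listing can be scanned for the daemons a single-node cluster should run. This is a sketch over the sample output above (note that NameNode is absent in that listing, which usually points back at a namenode configuration or format problem); on a live machine, replace the sample text with `$(jps)`.

```shell
# Sample jps output (mirrors the listing above); substitute "$(jps)" live.
jps_output='13594 Jps
13198 ResourceManager
13053 SecondaryNameNode
12869 DataNode
13319 NodeManager'
missing=''
for d in NameNode DataNode SecondaryNameNode ResourceManager NodeManager; do
    # -w: whole-word match, so "NameNode" does not match "SecondaryNameNode"
    printf '%s\n' "$jps_output" | grep -qw "$d" || missing="$missing $d"
done
echo "missing daemons:$missing"
```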

Try running the wordcount code:

Create a data folder on the Desktop:
hduser@Master:~$ sudo mkdir -p /home/nidos/Desktop/data
[sudo] password for hduser:
hduser@Master:~$ sudo chown -R hduser /home/nidos/Desktop
hduser@Master:~$ sudo chown -R nidos /home/nidos/Desktop

or, use:

whoami
chmod -R 700 /path/to/the/directory
chmod -R 777 /path/to/the/directory

Create a txt file:

hduser@Master:~$ cd /home/nidos/Desktop/data/
hduser@Master:/home/nidos/Desktop/data$ sudo jps >> testing.txt
hduser@Master:/home/nidos/Desktop/data$ cd
hduser@Master:~$ cd /usr/local/hadoop
hduser@Master:/usr/local/hadoop$ bin/hdfs dfs -mkdir /user
hduser@Master:/usr/local/hadoop$ bin/hdfs dfs -mkdir /user/hduser
Check it at http://localhost:50070

Figure 2 Checking the file system


hduser@Master:/usr/local/hadoop$ bin/hdfs dfs -put /home/nidos/Desktop/data input

hduser@Master:/usr/local/hadoop$ bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.3.jar wordcount input output

hduser@Master:/usr/local/hadoop$ bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.3.jar wordcount input output
16/11/14 03:39:29 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/11/14 03:39:37 INFO Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
16/11/14 03:39:37 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
16/11/14 03:39:40 INFO input.FileInputFormat: Total input paths to process : 1
16/11/14 03:39:40 INFO mapreduce.JobSubmitter: number of splits:1
16/11/14 03:39:43 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_local237463946_0001
16/11/14 03:39:45 INFO mapreduce.Job: The url to track the job: http://localhost:8080/
16/11/14 03:39:46 INFO mapreduce.Job: Running job: job_local237463946_0001
16/11/14 03:39:46 INFO mapred.LocalJobRunner: OutputCommitter set in config null
16/11/14 03:39:46 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
16/11/14 03:39:46 INFO mapred.LocalJobRunner: OutputCommitter is org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter
16/11/14 03:39:47 INFO mapreduce.Job: Job job_local237463946_0001 running in uber mode : false
16/11/14 03:39:47 INFO mapreduce.Job:  map 0% reduce 0%
16/11/14 03:39:47 INFO mapred.LocalJobRunner: Waiting for map tasks
16/11/14 03:39:47 INFO mapred.LocalJobRunner: Starting task: attempt_local237463946_0001_m_000000_0
16/11/14 03:39:48 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
16/11/14 03:39:48 INFO mapred.Task: Using ResourceCalculatorProcessTree : []
16/11/14 03:39:48 INFO mapred.MapTask: Processing split: hdfs://localhost:9000/user/hduser/input/testing.txt:0+104
16/11/14 03:39:54 INFO mapred.MapTask: (EQUATOR) 0 kvi 26214396(104857584)
16/11/14 03:39:54 INFO mapred.MapTask: mapreduce.task.io.sort.mb: 100
16/11/14 03:39:54 INFO mapred.MapTask: soft limit at 83886080
16/11/14 03:39:54 INFO mapred.MapTask: bufstart = 0; bufvoid = 104857600
16/11/14 03:39:54 INFO mapred.MapTask: kvstart = 26214396; length = 6553600
16/11/14 03:39:57 INFO mapred.MapTask: Map output collector class = org.apache.hadoop.mapred.MapTask$MapOutputBuffer
16/11/14 03:40:05 INFO mapred.LocalJobRunner:
16/11/14 03:40:05 INFO mapred.MapTask: Starting flush of map output
16/11/14 03:40:05 INFO mapred.MapTask: Spilling map output
16/11/14 03:40:05 INFO mapred.MapTask: bufstart = 0; bufend = 152; bufvoid = 104857600
16/11/14 03:40:05 INFO mapred.MapTask: kvstart = 26214396(104857584); kvend = 26214352(104857408); length = 45/6553600
16/11/14 03:40:05 INFO mapred.MapTask: Finished spill 0
16/11/14 03:40:05 INFO mapred.Task: Task:attempt_local237463946_0001_m_000000_0 is done. And is in the process of committing
16/11/14 03:40:05 INFO mapred.LocalJobRunner: map
16/11/14 03:40:05 INFO mapred.Task: Task 'attempt_local237463946_0001_m_000000_0' done.
16/11/14 03:40:05 INFO mapred.LocalJobRunner: Finishing task: attempt_local237463946_0001_m_000000_0
16/11/14 03:40:05 INFO mapred.LocalJobRunner: map task executor complete.
16/11/14 03:40:05 INFO mapred.LocalJobRunner: Waiting for reduce tasks
16/11/14 03:40:05 INFO mapred.LocalJobRunner: Starting task: attempt_local237463946_0001_r_000000_0
16/11/14 03:40:06 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
16/11/14 03:40:06 INFO mapred.Task: Using ResourceCalculatorProcessTree : []
16/11/14 03:40:06 INFO mapred.ReduceTask: Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@50060da6
16/11/14 03:40:06 INFO mapreduce.Job:  map 100% reduce 0%
16/11/14 03:40:06 INFO reduce.MergeManagerImpl: MergerManager: memoryLimit=363285696, maxSingleShuffleLimit=90821424, mergeThreshold=239768576, ioSortFactor=10, memToMemMergeOutputsThreshold=10
16/11/14 03:40:06 INFO reduce.EventFetcher: attempt_local237463946_0001_r_000000_0 Thread started: EventFetcher for fetching Map Completion Events
16/11/14 03:40:06 INFO reduce.LocalFetcher: localfetcher#1 about to shuffle output of map attempt_local237463946_0001_m_000000_0 decomp: 178 len: 182 to MEMORY
16/11/14 03:40:07 INFO reduce.InMemoryMapOutput: Read 178 bytes from map-output for attempt_local237463946_0001_m_000000_0
16/11/14 03:40:07 INFO reduce.MergeManagerImpl: closeInMemoryFile -> map-output of size: 178, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory -> 178
16/11/14 03:40:07 INFO reduce.EventFetcher: EventFetcher is interrupted.. Returning
16/11/14 03:40:07 INFO mapred.LocalJobRunner: 1 / 1 copied.
16/11/14 03:40:07 INFO reduce.MergeManagerImpl: finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
16/11/14 03:40:07 INFO mapred.Merger: Merging 1 sorted segments
16/11/14 03:40:07 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 170 bytes
16/11/14 03:40:07 INFO reduce.MergeManagerImpl: Merged 1 segments, 178 bytes to disk to satisfy reduce memory limit
16/11/14 03:40:07 INFO reduce.MergeManagerImpl: Merging 1 files, 182 bytes from disk
16/11/14 03:40:07 INFO reduce.MergeManagerImpl: Merging 0 segments, 0 bytes from memory into reduce
16/11/14 03:40:07 INFO mapred.Merger: Merging 1 sorted segments
16/11/14 03:40:07 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 170 bytes
16/11/14 03:40:07 INFO mapred.LocalJobRunner: 1 / 1 copied.
16/11/14 03:40:07 INFO Configuration.deprecation: mapred.skip.on is deprecated. Instead, use mapreduce.job.skiprecords
16/11/14 03:40:09 INFO mapred.Task: Task:attempt_local237463946_0001_r_000000_0 is done. And is in the process of committing
16/11/14 03:40:09 INFO mapred.LocalJobRunner: 1 / 1 copied.
16/11/14 03:40:09 INFO mapred.Task: Task attempt_local237463946_0001_r_000000_0 is allowed to commit now
16/11/14 03:40:09 INFO output.FileOutputCommitter: Saved output of task 'attempt_local237463946_0001_r_000000_0' to hdfs://localhost:9000/user/hduser/output/_temporary/0/task_local237463946_0001_r_000000
16/11/14 03:40:09 INFO mapred.LocalJobRunner: reduce > reduce
16/11/14 03:40:09 INFO mapred.Task: Task 'attempt_local237463946_0001_r_000000_0' done.
16/11/14 03:40:09 INFO mapred.LocalJobRunner: Finishing task: attempt_local237463946_0001_r_000000_0
16/11/14 03:40:09 INFO mapred.LocalJobRunner: reduce task executor complete.
16/11/14 03:40:10 INFO mapreduce.Job:  map 100% reduce 100%
16/11/14 03:40:10 INFO mapreduce.Job: Job job_local237463946_0001 completed successfully
16/11/14 03:40:10 INFO mapreduce.Job: Counters: 35
	File System Counters
		FILE: Number of bytes read=592354
		FILE: Number of bytes written=1163504
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=208
		HDFS: Number of bytes written=128
		HDFS: Number of read operations=13
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=4
	Map-Reduce Framework
		Map input records=6
		Map output records=12
		Map output bytes=152
		Map output materialized bytes=182
		Input split bytes=116
		Combine input records=12
		Combine output records=12
		Reduce input groups=12
		Reduce shuffle bytes=182
		Reduce input records=12
		Reduce output records=12
		Spilled Records=24
		Shuffled Maps =1
		Failed Shuffles=0
		Merged Map outputs=1
		GC time elapsed (ms)=297
		Total committed heap usage (bytes)=240656384
	Shuffle Errors
		BAD_ID=0
		CONNECTION=0
		IO_ERROR=0
		WRONG_LENGTH=0
		WRONG_MAP=0
		WRONG_REDUCE=0
	File Input Format Counters
		Bytes Read=104
	File Output Format Counters
		Bytes Written=128
hduser@Master:/usr/local/hadoop$

hduser@Master:/usr/local/hadoop$ bin/hdfs dfs -cat output/*


hduser@Master:/usr/local/hadoop$ bin/hdfs dfs -cat output/*
16/11/14 03:44:34 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17228	1
17351	1
17517	1
17728	1
17855	1
18407	1
DataNode	1
Jps	1
NameNode	1
NodeManager	1
ResourceManager	1
SecondaryNameNode	1
hduser@Master:/usr/local/hadoop$
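For intuition about what the job just did, wordcount's map phase (emit each word), shuffle phase (group identical words), and reduce phase (count each group) can be mimicked on a tiny input with an ordinary shell pipeline. This is only an illustrative sketch, not Hadoop:

```shell
# map:     tr splits the text into one word per line
# shuffle: sort brings identical words next to each other
# reduce:  uniq -c counts each group
printf 'DataNode Jps\nDataNode NameNode\n' | tr ' ' '\n' | sort | uniq -c
# prints each distinct word with its count (DataNode 2, Jps 1, NameNode 1)
```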

Figure 3 Checking the file system /user/hduser input output

Figure 4 Viewing the input file "testing.txt"

Figure 5 Viewing the wordcount output file
1.1 Hadoop Multi Cluster On Linux (Draft)

1.1.1 Creating the master PC and node1, .., node3
This multi-node cluster uses a master PC together with node1, node2, and node3.
3. Create the master PC, configured as follows:

Figure 6 The master PC (view 1)

Figure 8 The master PC (view 2)

All commands in the terminal (Linux):

nidos@master:~$ sudo gedit /etc/hostname
[sudo] password for nidos:
nidos@master:~$ sudo gedit /usr/local/hadoop/etc/hadoop/slaves
nidos@master:~$ sudo gedit /usr/local/hadoop/etc/hadoop/mapred-site.xml
nidos@master:~$ sudo gedit /usr/local/hadoop/etc/hadoop/yarn-site.xml
nidos@master:~$ sudo gedit /usr/local/hadoop/etc/hadoop/hdfs-site.xml
nidos@master:~$ sudo gedit /usr/local/hadoop/etc/hadoop/core-site.xml
nidos@master:~$ sudo gedit /etc/hosts
nidos@master:~$
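The /etc/hosts edit above is what lets the machines find each other by name. A hypothetical layout for this 4-machine cluster might look like the lines below; the IP addresses are placeholders, so use the addresses actually assigned on your network, and each hostname must match that machine's /etc/hostname:

```shell
# Hypothetical /etc/hosts entries for master + 3 nodes (placeholder IPs).
cat <<'EOF'
192.168.1.10   master
192.168.1.11   node1
192.168.1.12   node2
192.168.1.13   node3
EOF
```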

/usr/local/hadoop/etc/hadoop/core-site.xml

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License. See accompanying LICENSE file.
-->
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://master:9000</value>
  </property>
</configuration>

/usr/local/hadoop/etc/hadoop/hdfs-site.xml
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License. See accompanying LICENSE file.
-->
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:/usr/local/hadoop_tmp/hdfs/namenode</value>
  </property>
</configuration>

/usr/local/hadoop/etc/hadoop/yarn-site.xml
<?xml version="1.0"?>
<!--
  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License. See accompanying LICENSE file.
-->
<configuration>
  <property>
    <name>yarn.resourcemanager.resource-tracker.address</name>
    <value>master:8025</value>
  </property>
  <property>
    <name>yarn.resourcemanager.scheduler.address</name>
    <value>master:8035</value>
  </property>
  <property>
    <name>yarn.resourcemanager.address</name>
    <value>master:8050</value>
  </property>
</configuration>
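The three yarn-site.xml properties above pin the ResourceManager's resource-tracker, scheduler, and client ports to the master host. As a quick check of what a file in this style declares, the host:port values can be pulled back out with sed; this is a sketch over an inline sample, and on the real file you would run the same sed over /usr/local/hadoop/etc/hadoop/yarn-site.xml:

```shell
# Extract every <value>host:port</value> from yarn-site.xml style text.
sample='<value>master:8025</value>
<value>master:8035</value>
<value>master:8050</value>'
printf '%s\n' "$sample" | sed -n 's:.*<value>\(.*\)</value>.*:\1:p'
```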

/usr/local/hadoop/etc/hadoop/mapred-site.xml
<?xml version="1.0"?>
<!--
  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License. See accompanying LICENSE file.
-->
<configuration>
  <property>
    <name>mapreduce.job.tracker</name>
    <value>master:54311</value>
  </property>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>

4. Install the JDK for Linux

a. Open a terminal and type nidos@master:~$ sudo apt-get update; wait until it is done.
b. Type nidos@master:~$ sudo apt-get install default-jdk; after it is done, type nidos@master:~$ java -version.

nidos@master:~$ java -version
java version "1.7.0_111"
OpenJDK Runtime Environment (IcedTea 2.6.7) (7u111-2.6.7-0ubuntu0.14.04.3)
OpenJDK 64-Bit Server VM (build 24.111-b01, mixed mode)

5. Install ssh: type nidos@master:~$ sudo apt-get install ssh; after it is done, type
nidos@master:~$ ssh-keygen -t rsa -P ""

If the prompt Enter file in which to save the key (/home/nidos/.ssh/id_rsa): appears, press Enter.

nidos@master:~$ ssh-keygen -t rsa -P ""
Generating public/private rsa key pair.
Enter file in which to save the key (/home/nidos/.ssh/id_rsa):
Created directory '/home/nidos/.ssh'.
Your identification has been saved in /home/nidos/.ssh/id_rsa.
Your public key has been saved in /home/nidos/.ssh/id_rsa.pub.
The key fingerprint is:
4c:76:eb:8d:f8:25:ce:3c:37:d2:11:26:f4:70:5c:e0 nidos@master
The key's randomart image is:
+--[ RSA 2048]----+
|..o.             |
|o.o              |
|o..+E            |
|+...+            |
|S.o.             |
|oo.              |
|.+.o.            |
|=oo+             |
|=+.              |
+-----------------+

Then type nidos@master:~$ cat $HOME/.ssh/id_rsa.pub >> $HOME/.ssh/authorized_keys
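The `cat ... >> authorized_keys` step appends unconditionally, so re-running it duplicates the key. A slightly safer variant appends only when the key is not already present; the sketch below demonstrates the idea on a temporary file with a placeholder key string rather than a real key:

```shell
# Idempotent append: add the key line only if it is not already in the file.
pub='ssh-rsa AAAAB3...placeholder nidos@master'   # placeholder public key
auth=$(mktemp)                                     # stand-in for authorized_keys
grep -qxF "$pub" "$auth" || printf '%s\n' "$pub" >> "$auth"
grep -qxF "$pub" "$auth" || printf '%s\n' "$pub" >> "$auth"   # no-op the second time
echo "copies of key: $(grep -cxF "$pub" "$auth")"
rm -f "$auth"
```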
6. Edit the IP address for the master PC: type nidos@master:~$ ifconfig, and output like this will appear:

nidos@master:~$ ifconfig
eth0      Link encap:Ethernet  HWaddr 08:00:27:77:9f:44
          inet addr:10.0.2.15  Bcast:10.0.2.255  Mask:255.255.255.0
          inet6 addr: fe80::a00:27ff:fe77:9f44/64 Scope:Link
          UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
          RX packets:53374 errors:0 dropped:0 overruns:0 frame:0
          TX packets:26386 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1000
          RX bytes:73964085 (73.9 MB)  TX bytes:1827396 (1.8 MB)

lo        Link encap:Local Loopback
          inet addr:127.0.0.1  Mask:255.0.0.0
          inet6 addr: ::1/128 Scope:Host
          UP LOOPBACK RUNNING  MTU:65536  Metric:1
          RX packets:2794 errors:0 dropped:0 overruns:0 frame:0
          TX packets:2794 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:0
          RX bytes:222430 (222.4 KB)  TX bytes:222430 (222.4 KB)
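When scripting the multi-node setup, it helps to pull just the IPv4 address out of output like the above. A sed sketch over the sample line is shown below; on a live system you would pipe `ifconfig eth0` in instead of the sample string:

```shell
# Extract the IPv4 address from an ifconfig "inet addr:" line.
sample='          inet addr:10.0.2.15  Bcast:10.0.2.255  Mask:255.255.255.0'
printf '%s\n' "$sample" | sed -n 's/.*inet addr:\([0-9.]*\).*/\1/p'
```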

Uncheck Enable Networking, then check Enable Networking again. Then click Edit Connections...

Then click Save.

7. Edit ~/.bashrc: nidos@master:~$ sudo gedit ~/.bashrc

nidos@master:~$ sudo gedit ~/.bashrc

# ~/.bashrc: executed by bash(1) for non-login shells.
# see /usr/share/doc/bash/examples/startup-files (in the package bash-doc)
# for examples

# If not running interactively, don't do anything
case $- in
    *i*) ;;
      *) return;;
esac

# don't put duplicate lines or lines starting with space in the history.
# See bash(1) for more options
HISTCONTROL=ignoreboth

# append to the history file, don't overwrite it
shopt -s histappend

# for setting history length see HISTSIZE and HISTFILESIZE in bash(1)
HISTSIZE=1000
HISTFILESIZE=2000

# check the window size after each command and, if necessary,
# update the values of LINES and COLUMNS.
shopt -s checkwinsize

# If set, the pattern "**" used in a pathname expansion context will
# match all files and zero or more directories and subdirectories.
#shopt -s globstar

# make less more friendly for non-text input files, see lesspipe(1)
[ -x /usr/bin/lesspipe ] && eval "$(SHELL=/bin/sh lesspipe)"

# set variable identifying the chroot you work in (used in the prompt below)
if [ -z "${debian_chroot:-}" ] && [ -r /etc/debian_chroot ]; then
    debian_chroot=$(cat /etc/debian_chroot)
fi

# set a fancy prompt (non-color, unless we know we "want" color)
case "$TERM" in
    xterm-color) color_prompt=yes;;
esac

# uncomment for a colored prompt, if the terminal has the capability; turned
# off by default to not distract the user: the focus in a terminal window
# should be on the output of commands, not on the prompt
#force_color_prompt=yes

if [ -n "$force_color_prompt" ]; then
    if [ -x /usr/bin/tput ] && tput setaf 1 >&/dev/null; then
	# We have color support; assume it's compliant with Ecma-48
	# (ISO/IEC-6429). (Lack of such support is extremely rare, and such
	# a case would tend to support setf rather than setaf.)
	color_prompt=yes
    else
	color_prompt=
    fi
fi

if [ "$color_prompt" = yes ]; then
    PS1='${debian_chroot:+($debian_chroot)}\[\033[01;32m\]\u@\h\[\033[00m\]:\[\033[01;34m\]\w\[\033[00m\]\$ '
else
    PS1='${debian_chroot:+($debian_chroot)}\u@\h:\w\$ '
fi
unset color_prompt force_color_prompt

# If this is an xterm set the title to user@host:dir
case "$TERM" in
xterm*|rxvt*)
    PS1="\[\e]0;${debian_chroot:+($debian_chroot)}\u@\h: \w\a\]$PS1"
    ;;
*)
    ;;
esac

# enable color support of ls and also add handy aliases
if [ -x /usr/bin/dircolors ]; then
    test -r ~/.dircolors && eval "$(dircolors -b ~/.dircolors)" || eval "$(dircolors -b)"
    alias ls='ls --color=auto'
    #alias dir='dir --color=auto'
    #alias vdir='vdir --color=auto'

    alias grep='grep --color=auto'
    alias fgrep='fgrep --color=auto'
    alias egrep='egrep --color=auto'
fi

# some more ls aliases
alias ll='ls -alF'
alias la='ls -A'
alias l='ls -CF'

# Add an "alert" alias for long running commands.  Use like so:
#   sleep 10; alert
alias alert='notify-send --urgency=low -i "$([ $? = 0 ] && echo terminal || echo error)" "$(history|tail -n1|sed -e '\''s/^\s*[0-9]\+\s*//;s/[;&|]\s*alert$//'\'')"'

# Alias definitions.
# You may want to put all your additions into a separate file like
# ~/.bash_aliases, instead of adding them here directly.
# See /usr/share/doc/bash-doc/examples in the bash-doc package.

if [ -f ~/.bash_aliases ]; then
    . ~/.bash_aliases
fi

# enable programmable completion features (you don't need to enable
# this, if it's already enabled in /etc/bash.bashrc and /etc/profile
# sources /etc/bash.bashrc).
if ! shopt -oq posix; then
  if [ -f /usr/share/bash-completion/bash_completion ]; then
    . /usr/share/bash-completion/bash_completion
  elif [ -f /etc/bash_completion ]; then
    . /etc/bash_completion
  fi
fi

export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
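After running `source ~/.bashrc`, it is worth confirming that the Hadoop directories really landed on PATH. A minimal check, re-creating the two relevant exports here so the snippet stands alone:

```shell
# Re-create the relevant exports from ~/.bashrc, then verify PATH contains them.
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin:$HADOOP_INSTALL/sbin
case ":$PATH:" in
    *":$HADOOP_INSTALL/bin:"*) echo "hadoop bin is on PATH" ;;
    *) echo "hadoop bin is NOT on PATH" ;;
esac
```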

Then type nidos@master:~$ source ~/.bashrc and press Enter.

8. Then type nidos@master:~$ sudo gedit /usr/local/hadoop/etc/hadoop/hadoop-env.sh

Edit JAVA_HOME to become:

# The java implementation to use.
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64

9. Type nidos@master:~$ sudo gedit /usr/local/hadoop/etc/hadoop/core-site.xml, then edit it.

Type nidos@master:~$ sudo gedit /usr/local/hadoop/etc/hadoop/mapred-site.xml, then edit it.

10. Type nidos@master:~$ hdfs namenode -format

nidos@master:~$ hdfs namenode -format
16/10/31 13:47:21 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = master/127.0.0.1
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 2.7.3
STARTUP_MSG:   classpath = /usr/local/hadoop/etc/hadoop:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar: ... (long classpath listing truncated) ...
STARTUP_MSG:   build = https://git-wip-us.apache.org/repos/asf/hadoop.git -r baa91f7c6bc9cb92be5982de4719c1c8af91ccff; compiled by
'root'on20160818T01:41Z
STARTUP_MSG:java=1.7.0_111
*****************************************************
*******/
16/10/3113:47:21INFOnamenode.NameNode:registered
UNIXsignalhandlersfor[TERM,HUP,INT]

51

16/10/3113:47:21INFOnamenode.NameNode:
createNameNode[format]
16/10/3113:47:26WARNutil.NativeCodeLoader:Unable
toloadnativehadooplibraryforyourplatform...using
builtinjavaclasseswhereapplicable
Formattingusingclusterid:CIDc2534134a1e54b24
b2c03c61f0cbfbf9
16/10/3113:47:28INFOnamenode.FSNamesystem:No
KeyProviderfound.
16/10/3113:47:28INFOnamenode.FSNamesystem:fsLock
isfair:true
16/10/3113:47:29INFO
blockmanagement.DatanodeManager:
dfs.block.invalidate.limit=1000
16/10/3113:47:29INFO
blockmanagement.DatanodeManager:
dfs.namenode.datanode.registration.iphostnamecheck=true
16/10/3113:47:29INFOblockmanagement.BlockManager:
dfs.namenode.startup.delay.block.deletion.secissetto
000:00:00:00.000
16/10/3113:47:29INFOblockmanagement.BlockManager:
Theblockdeletionwillstartaround2016Oct3113:47:29
16/10/3113:47:29INFOutil.GSet:Computingcapacity
formapBlocksMap
16/10/3113:47:29INFOutil.GSet:VMtype=64
bit
16/10/3113:47:29INFOutil.GSet:2.0%maxmemory
966.7MB=19.3MB
16/10/3113:47:29INFOutil.GSet:capacity=
2^21=2097152entries
16/10/3113:47:29INFOblockmanagement.BlockManager:
dfs.block.access.token.enable=false
16/10/3113:47:29INFOblockmanagement.BlockManager:
defaultReplication=3
16/10/3113:47:29INFOblockmanagement.BlockManager:
maxReplication=512
16/10/3113:47:29INFOblockmanagement.BlockManager:
minReplication=1
16/10/3113:47:29INFOblockmanagement.BlockManager:
maxReplicationStreams=2
16/10/3113:47:29INFOblockmanagement.BlockManager:
replicationRecheckInterval=3000
16/10/3113:47:29INFOblockmanagement.BlockManager:
encryptDataTransfer=false
16/10/3113:47:29INFOblockmanagement.BlockManager:
maxNumBlocksToLog=1000
16/10/3113:47:29INFOnamenode.FSNamesystem:fsOwner
=nidos(auth:SIMPLE)
16/10/3113:47:29INFOnamenode.FSNamesystem:
supergroup=supergroup

52

16/10/3113:47:29INFOnamenode.FSNamesystem:
isPermissionEnabled=true
16/10/3113:47:29INFOnamenode.FSNamesystem:HA
Enabled:false
16/10/3113:47:29INFOnamenode.FSNamesystem:Append
Enabled:true
16/10/3113:47:30INFOutil.GSet:Computingcapacity
formapINodeMap
16/10/3113:47:30INFOutil.GSet:VMtype=64
bit
16/10/3113:47:30INFOutil.GSet:1.0%maxmemory
966.7MB=9.7MB
16/10/3113:47:30INFOutil.GSet:capacity=
2^20=1048576entries
16/10/3113:47:30INFOnamenode.FSDirectory:ACLs
enabled?false
16/10/3113:47:30INFOnamenode.FSDirectory:XAttrs
enabled?true
16/10/3113:47:30INFOnamenode.FSDirectory:Maximum
sizeofanxattr:16384
16/10/3113:47:30INFOnamenode.NameNode:Caching
filenamesoccuringmorethan10times
16/10/3113:47:30INFOutil.GSet:Computingcapacity
formapcachedBlocks
16/10/3113:47:30INFOutil.GSet:VMtype=64
bit
16/10/3113:47:30INFOutil.GSet:0.25%maxmemory
966.7MB=2.4MB
16/10/3113:47:30INFOutil.GSet:capacity=
2^18=262144entries
16/10/3113:47:30INFOnamenode.FSNamesystem:
dfs.namenode.safemode.thresholdpct=0.9990000128746033
16/10/3113:47:30INFOnamenode.FSNamesystem:
dfs.namenode.safemode.min.datanodes=0
16/10/3113:47:30INFOnamenode.FSNamesystem:
dfs.namenode.safemode.extension=30000
16/10/3113:47:30INFOmetrics.TopMetrics:NNTop
conf:dfs.namenode.top.window.num.buckets=10
16/10/3113:47:30INFOmetrics.TopMetrics:NNTop
conf:dfs.namenode.top.num.users=10
16/10/3113:47:30INFOmetrics.TopMetrics:NNTop
conf:dfs.namenode.top.windows.minutes=1,5,25
16/10/3113:47:30INFOnamenode.FSNamesystem:Retry
cacheonnamenodeisenabled
16/10/3113:47:30INFOnamenode.FSNamesystem:Retry
cachewilluse0.03oftotalheapandretrycacheentry
expirytimeis600000millis
16/10/3113:47:30INFOutil.GSet:Computingcapacity
formapNameNodeRetryCache
16/10/3113:47:30INFOutil.GSet:VMtype=64

53

bit
16/10/3113:47:30INFOutil.GSet:
0.029999999329447746%maxmemory966.7MB=297.0KB
16/10/3113:47:30INFOutil.GSet:capacity=
2^15=32768entries
16/10/3113:47:31INFOnamenode.FSImage:Allocated
newBlockPoolId:BP914027567127.0.0.11477896450852
16/10/3113:47:31INFOcommon.Storage:Storage
directory/usr/local/hadoop_tmp/hdfs/namenodehasbeen
successfullyformatted.
16/10/3113:47:31INFO
namenode.FSImageFormatProtobuf:Savingimagefile
/usr/local/hadoop_tmp/hdfs/namenode/current/fsimage.ckpt_00
00000000000000000usingnocompression
16/10/3113:47:32INFO
namenode.FSImageFormatProtobuf:Imagefile
/usr/local/hadoop_tmp/hdfs/namenode/current/fsimage.ckpt_00
00000000000000000ofsize352bytessavedin0seconds.
16/10/3113:47:32INFO
namenode.NNStorageRetentionManager:Goingtoretain1
imageswithtxid>=0
16/10/3113:47:32INFOutil.ExitUtil:Exitingwith
status0
16/10/3113:47:32INFOnamenode.NameNode:
SHUTDOWN_MSG:
/
***********************************************************
*
SHUTDOWN_MSG:ShuttingdownNameNodeat
master/127.0.0.1
*****************************************************
*******/
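As a quick sanity check after the format output above, the namenode storage directory can be inspected for the VERSION file the format step writes. A minimal sketch (the path matches this tutorial's dfs.namenode.name.dir, and check_namenode_format is a helper name introduced here, not a Hadoop command):

```shell
# check_namenode_format DIR — report whether `hdfs namenode -format` has
# populated DIR (in this tutorial, /usr/local/hadoop_tmp/hdfs/namenode).
check_namenode_format() {
  local dir="$1"
  if [ -f "$dir/current/VERSION" ]; then
    echo "formatted"
  else
    echo "not formatted"
  fi
}

# Example (on the master, after the format step):
#   check_namenode_format /usr/local/hadoop_tmp/hdfs/namenode
```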

11. Create PC node1, PC node2, and PC node3 by cloning them from the PC master.


Figure 7: Clone from the PC master

Figure 10: Click Next


Figure 8: Click Clone

Figure 9: All PC nodes are ready


12. Start all the PCs (master, node1, node2, node3).


Figure 10: Click Normal Start

Figure 14: All PCs are running


13. On the PC master, open a terminal and type nidos@master:~$ ifconfig


nidos@master:~$ ifconfig
eth0      Link encap:Ethernet  HWaddr 08:00:27:77:9f:44
          inet addr:10.0.2.15  Bcast:10.0.2.255  Mask:255.255.255.0
          inet6 addr: fe80::a00:27ff:fe77:9f44/64 Scope:Link
          UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
          RX packets:4 errors:0 dropped:0 overruns:0 frame:0
          TX packets:62 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1000
          RX bytes:1328 (1.3 KB)  TX bytes:9454 (9.4 KB)

lo        Link encap:Local Loopback
          inet addr:127.0.0.1  Mask:255.0.0.0
          inet6 addr: ::1/128 Scope:Host
          UP LOOPBACK RUNNING  MTU:65536  Metric:1
          RX packets:150 errors:0 dropped:0 overruns:0 frame:0
          TX packets:150 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:0
          RX bytes:11577 (11.5 KB)  TX bytes:11577 (11.5 KB)
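If a later script needs the VM's address, the IPv4 address can be pulled out of ifconfig output like the transcript above. A small sketch (parse_inet_addr is a hypothetical helper; it assumes the classic pre-iproute2 "inet addr:" format shown here):

```shell
# Parse the first IPv4 address from classic ifconfig output read on stdin,
# i.e. a line of the form "inet addr:10.0.2.15  Bcast:...".
parse_inet_addr() {
  awk -F'[: ]+' '/inet addr/ {print $4; exit}'
}

# Usage on the VM (eth0's address is printed first):
#   ifconfig | parse_inet_addr
```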

Accessing the Ubuntu Virtual Machine from Windows:

Install PuTTY on Windows.
On Ubuntu, do the following:
nidos@Master:~$ sudo apt-get install openssh-client
nidos@Master:~$ sudo apt-get install openssh-server


Rename Wired connection 1 to master

Select the Manual method and click Add


Double-check what you entered; it should look like the following

Check the ifconfig output again.



nidos@master:~$ ifconfig
eth0      Link encap:Ethernet  HWaddr 08:00:27:77:9f:44
          inet addr:192.168.2.116  Bcast:192.168.2.255  Mask:255.255.255.0
          inet6 addr: fe80::a00:27ff:fe77:9f44/64 Scope:Link
          UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
          RX packets:9 errors:0 dropped:0 overruns:0 frame:0
          TX packets:132 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1000
          RX bytes:1704 (1.7 KB)  TX bytes:19263 (19.2 KB)

lo        Link encap:Local Loopback
          inet addr:127.0.0.1  Mask:255.0.0.0
          inet6 addr: ::1/128 Scope:Host
          UP LOOPBACK RUNNING  MTU:65536  Metric:1
          RX packets:270 errors:0 dropped:0 overruns:0 frame:0
          TX packets:270 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:0
          RX bytes:20953 (20.9 KB)  TX bytes:20953 (20.9 KB)


14. Set the master file with nidos@master:~$ sudo gedit /usr/local/hadoop/etc/hadoop/master

15. Create the hadoop_tmp folder and the namenode directory, then hand ownership to the user:

nidos@master:~$ sudo mkdir -p /usr/local/hadoop_tmp
nidos@master:~$ sudo mkdir -p /usr/local/hadoop_tmp/hdfs/namenode
nidos@master:~$ sudo chown -R nidos /usr/local/hadoop_tmp
nidos@master:~$
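To confirm the directories above ended up with the right owner, something like the following can be used (dir_owned_by is a helper name introduced here; stat -c is the GNU coreutils form found on Ubuntu):

```shell
# dir_owned_by DIR USER — succeed if DIR exists and is owned by USER,
# e.g. after the `sudo chown -R nidos /usr/local/hadoop_tmp` above.
dir_owned_by() {
  local dir="$1" user="$2"
  [ -d "$dir" ] && [ "$(stat -c '%U' "$dir")" = "$user" ]
}

# Example: dir_owned_by /usr/local/hadoop_tmp nidos && echo OK
```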


Initial state:

Result:


16. On PC node1, log in and open a terminal.


Empty the slaves file with nidos@node1:~$ sudo gedit /usr/local/hadoop/etc/hadoop/slaves
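Instead of editing the slaves file by hand in gedit, its contents can also be generated from the shell. A sketch under this tutorial's naming (write_slaves is a hypothetical helper, and node1..node3 are assumed to resolve via /etc/hosts):

```shell
# write_slaves FILE HOST... — write one worker hostname per line into
# Hadoop's slaves file (overwriting any previous contents).
write_slaves() {
  local slaves_file="$1"; shift
  printf '%s\n' "$@" > "$slaves_file"
}

# Example (on the master):
#   write_slaves /usr/local/hadoop/etc/hadoop/slaves node1 node2 node3
```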

nidos@master:~$ ifconfig
eth0      Link encap:Ethernet  HWaddr 08:00:27:77:9f:44
          inet addr:192.168.2.117  Bcast:192.168.2.255  Mask:255.255.255.0
          inet6 addr: fe80::a00:27ff:fe77:9f44/64 Scope:Link
          UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
          RX packets:6 errors:0 dropped:0 overruns:0 frame:0
          TX packets:118 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1000
          RX bytes:1448 (1.4 KB)  TX bytes:18121 (18.1 KB)

lo        Link encap:Local Loopback
          inet addr:127.0.0.1  Mask:255.0.0.0
          inet6 addr: ::1/128 Scope:Host
          UP LOOPBACK RUNNING  MTU:65536  Metric:1
          RX packets:234 errors:0 dropped:0 overruns:0 frame:0
          TX packets:234 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:0
          RX bytes:18457 (18.4 KB)  TX bytes:18457 (18.4 KB)

Then type nidos@master:~$ sudo /etc/init.d/networking restart

17. Type nidos@master:~$ sudo -i gedit /usr/local/hadoop/etc/hadoop/hdfs-site.xml and modify its contents

Then restart PC node1.


18. Do the same as above for PC node2 and PC node3, just like PC node1.
19. Then, on PC node1, create the hadoop_tmp folder and the datanode directory:

nidos@node1:~$ sudo mkdir -p /usr/local/hadoop_tmp
[sudo] password for nidos:
nidos@node1:~$ sudo mkdir -p /usr/local/hadoop_tmp/datanode
nidos@node1:~$ sudo chown -R nidos /usr/local/hadoop_tmp
nidos@node1:~$

Then do the same on PC node2 and PC node3.
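Since step 19's commands are identical on every worker node, they can be wrapped in a small function and repeated per node (prepare_datanode_dir is a name introduced here; the paths and the nidos owner follow this tutorial, and the function should be run on each node as root or via sudo):

```shell
# prepare_datanode_dir BASE OWNER — create BASE/datanode and hand the
# whole BASE tree to OWNER, mirroring the three commands in step 19.
prepare_datanode_dir() {
  local base="$1" owner="$2"
  mkdir -p "$base/datanode"
  chown -R "$owner" "$base"
}

# On each of node1..node3 (as root / via sudo):
#   prepare_datanode_dir /usr/local/hadoop_tmp nidos
```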
