start                    start-pulseaudio-x11     start-stop-daemon
startpar                 start-pulseaudio-kde     startx
startpar-upstart-inject
hduser@midarto-ThinkPad-Edge-E130:/usr/local/sbin$ cd
hduser@midarto-ThinkPad-Edge-E130:~$ cd /usr/local/hadoop
hadoop/        hadoop_store/
hduser@midarto-ThinkPad-Edge-E130:~$ cd /usr/local/hadoop/
bin/ etc/ include/ lib/ libexec/ sbin/ share/
hduser@midarto-ThinkPad-Edge-E130:~$ cd /usr/local/hadoop/sbin/
hduser@midarto-ThinkPad-Edge-E130:/usr/local/hadoop/sbin$ ls
distribute-exclude.sh     start-all.cmd        stop-balancer.sh
hadoop-daemon.sh          start-all.sh         stop-dfs.cmd
hadoop-daemons.sh         start-balancer.sh    stop-dfs.sh
hdfs-config.cmd           start-dfs.cmd        stop-secure-dns.sh
hdfs-config.sh            start-dfs.sh         stop-yarn.cmd
httpfs.sh                 start-secure-dns.sh  stop-yarn.sh
kms.sh                    start-yarn.cmd       yarn-daemon.sh
mr-jobhistory-daemon.sh   start-yarn.sh        yarn-daemons.sh
refresh-namenodes.sh      stop-all.cmd
slaves.sh                 stop-all.sh
hduser@midarto-ThinkPad-Edge-E130:/usr/local/hadoop/sbin$ start-all.sh
bash: /usr/local/hadoop/sbin/start-all.sh: Permission denied
hduser@midarto-ThinkPad-Edge-E130:/usr/local/hadoop/sbin$ cd
hduser@midarto-ThinkPad-Edge-E130:~$ sudo chmod -R 777 /usr/local/hadoop/sbin/
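The `chmod -R 777` above does clear the Permission denied error, but it also grants every local user write access to the Hadoop control scripts. A narrower sketch (assuming `hduser` owns the tree, as in this setup): 755 keeps the scripts executable without making them world-writable. Demonstrated here on a scratch directory rather than the real `/usr/local/hadoop/sbin`:

```shell
# Hypothetical demo: 755 (owner rwx, group/other r-x) is enough to
# clear "Permission denied" when executing a script you own.
demo=$(mktemp -d)
printf '#!/bin/sh\necho started\n' > "$demo/start-all.sh"
chmod -R 755 "$demo"
"$demo/start-all.sh"   # prints: started
rm -rf "$demo"
```

On the real tree the equivalent would be `sudo chmod -R 755 /usr/local/hadoop/sbin/`.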
hduser@midarto-ThinkPad-Edge-E130:~$ cd /usr/local/hadoop/sbin/
hduser@midarto-ThinkPad-Edge-E130:/usr/local/hadoop/sbin$ str
strace strings strip
hduser@midarto-ThinkPad-Edge-E130:/usr/local/hadoop/sbin$ start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
15/06/15 15:26:59 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: starting namenode, logging to /usr/local/hadoop/logs/hadoop-hduser-namenode-midarto-ThinkPad-Edge-E130.out
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-hduser-datanode-midarto-ThinkPad-Edge-E130.out
Starting secondary namenodes [0.0.0.0]
The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.
ECDSA key fingerprint is d0:33:ed:28:d4:55:e7:f0:32:e8:26:be:92:07:fe:fa.
Are you sure you want to continue connecting (yes/no)? yes
0.0.0.0: Warning: Permanently added '0.0.0.0' (ECDSA) to the list of known hosts.
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-midarto-ThinkPad-Edge-E130.out
15/06/15 15:30:28 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
starting yarn daemons
starting resourcemanager, logging to /usr/local/hadoop/logs/yarn-hduser-resourcemanager-midarto-ThinkPad-Edge-E130.out
localhost: starting nodemanager, logging to /usr/local/hadoop/logs/yarn-hduser-nodemanager-midarto-ThinkPad-Edge-E130.out
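As the first line of the output notes, `start-all.sh` is deprecated in this Hadoop release in favor of `start-dfs.sh` plus `start-yarn.sh`. A sketch of the recommended form (the `/usr/local/hadoop` path matches this transcript; the existence check is an addition so the snippet fails loudly on other machines):

```shell
# Start HDFS and YARN the non-deprecated way, guarding against a missing install.
HADOOP_SBIN=${HADOOP_SBIN:-/usr/local/hadoop/sbin}
if [ -x "$HADOOP_SBIN/start-dfs.sh" ] && [ -x "$HADOOP_SBIN/start-yarn.sh" ]; then
  "$HADOOP_SBIN/start-dfs.sh" && "$HADOOP_SBIN/start-yarn.sh"
else
  echo "Hadoop sbin scripts not found under $HADOOP_SBIN" >&2
fi
```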
hduser@midarto-ThinkPad-Edge-E130:/usr/local/hadoop/sbin$ jps
4053 NodeManager
3376 DataNode
3724 ResourceManager
3576 SecondaryNameNode
3215 NameNode
4156 Jps
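The `jps` listing above is the quick health check: five Hadoop daemons plus `jps` itself. That check can be automated with a small function (`check_daemons` is a name invented here, not part of Hadoop) that scans a `jps` listing on stdin:

```shell
# Fail with a message if any of the five expected daemons is absent.
check_daemons() {
  listing=$(cat)   # pipe `jps` output in
  for d in NameNode DataNode SecondaryNameNode ResourceManager NodeManager; do
    printf '%s\n' "$listing" | grep -qw "$d" || { echo "missing: $d"; return 1; }
  done
  echo "all daemons up"
}
# usage on a live node: jps | check_daemons
```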
hduser@midarto-ThinkPad-Edge-E130:/usr/local/hadoop/sbin$ cd
hduser@midarto-ThinkPad-Edge-E130:~$ hadoop jar /usr/local/hadoop/
bin/  include/  libexec/     logs/        README.txt  share/
etc/  lib/      LICENSE.txt  NOTICE.txt   sbin/
hduser@midarto-ThinkPad-Edge-E130:~$ hadoop jar /usr/local/hadoop/share/hadoop/
common/  hdfs/  httpfs/  kms/  mapreduce/  tools/  yarn/
hduser@midarto-ThinkPad-Edge-E130:~$ hadoop jar /usr/local/hadoop/share/hadoop/mapreduce/
hadoop-mapreduce-client-app-2.7.0.jar
hadoop-mapreduce-client-common-2.7.0.jar
hadoop-mapreduce-client-core-2.7.0.jar
hadoop-mapreduce-client-hs-2.7.0.jar
hadoop-mapreduce-client-hs-plugins-2.7.0.jar
hadoop-mapreduce-client-jobclient-2.7.0.jar
hadoop-mapreduce-client-jobclient-2.7.0-tests.jar
hadoop-mapreduce-client-shuffle-2.7.0.jar
hadoop-mapreduce-examples-2.7.0.jar
lib/
lib-examples/
sources/
hduser@midarto-ThinkPad-Edge-E130:~$ hadoop jar /usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.0.jar pi 2 5
Number of Maps = 2
Samples per Map = 5
15/06/15 15:32:29 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Wrote input for Map #0
Wrote input for Map #1
Starting Job
15/06/15 15:32:32 INFO Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
15/06/15 15:32:32 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
15/06/15 15:32:32 INFO input.FileInputFormat: Total input paths to process : 2
15/06/15 15:32:32 INFO mapreduce.JobSubmitter: number of splits:2
15/06/15 15:32:33 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_local1644526633_0001
15/06/15 15:32:33 INFO mapreduce.Job: The url to track the job: http://localhost:8080/
15/06/15 15:32:33 INFO mapreduce.Job: Running job: job_local1644526633_0001
15/06/15 15:32:33 INFO mapred.LocalJobRunner: OutputCommitter set in config null
15/06/15 15:32:33 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
15/06/15 15:32:33 INFO mapred.LocalJobRunner: OutputCommitter is org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter
15/06/15 15:32:33 INFO mapred.LocalJobRunner: Waiting for map tasks
15/06/15 15:32:33 INFO mapred.LocalJobRunner: Starting task: attempt_local1644526633_0001_m_000000_0
15/06/15 15:32:33 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
15/06/15 15:32:33 INFO mapred.Task: Using ResourceCalculatorProcessTree : [ ]
15/06/15 15:32:33 INFO mapred.MapTask: Processing split: hdfs://localhost:54310/user/hduser/QuasiMonteCarlo_1434353547302_418935020/in/part0:0+118
15/06/15 15:32:33 INFO mapred.MapTask: (EQUATOR) 0 kvi 26214396(104857584)
15/06/15 15:32:33 INFO mapred.MapTask: mapreduce.task.io.sort.mb: 100
15/06/15 15:32:33 INFO mapred.MapTask: soft limit at 83886080
15/06/15 15:32:33 INFO mapred.MapTask: bufstart = 0; bufvoid = 104857600
15/06/15 15:32:33 INFO mapred.MapTask: kvstart = 26214396; length = 6553600
15/06/15 15:32:33 INFO mapred.MapTask: Map output collector class = org.apache.hadoop.mapred.MapTask$MapOutputBuffer
15/06/15 15:32:34 INFO mapred.LocalJobRunner:
15/06/15 15:32:34 INFO mapred.MapTask: Starting flush of map output
15/06/15 15:32:34 INFO mapred.MapTask: Spilling map output
15/06/15 15:32:34 INFO mapred.MapTask: bufstart = 0; bufend = 18; bufvoid = 104857600
15/06/15 15:32:34 INFO mapred.MapTask: kvstart = 26214396(104857584); kvend = 26214392(104857568); length = 5/6553600
15/06/15 15:32:34 INFO mapred.MapTask: Finished spill 0
15/06/15 15:32:34 INFO mapred.Task: Task:attempt_local1644526633_0001_m_000000_0 is done. And is in the process of committing
15/06/15 15:32:34 INFO mapred.LocalJobRunner: map
15/06/15 15:32:34 INFO mapred.Task: Task 'attempt_local1644526633_0001_m_000000_0' done.
15/06/15 15:32:34 INFO mapred.LocalJobRunner: Finishing task: attempt_local1644526633_0001_m_000000_0
15/06/15 15:32:34 INFO mapred.LocalJobRunner: Starting task: attempt_local1644526633_0001_m_000001_0
15/06/15 15:32:34 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
15/06/15 15:32:34 INFO mapred.Task: Using ResourceCalculatorProcessTree : [ ]
15/06/15 15:32:34 INFO mapred.MapTask: Processing split: hdfs://localhost:54310/user/hduser/QuasiMonteCarlo_1434353547302_418935020/in/part1:0+118
15/06/15 15:32:34 INFO mapred.MapTask: (EQUATOR) 0 kvi 26214396(104857584)
15/06/15 15:32:34 INFO mapred.MapTask: mapreduce.task.io.sort.mb: 100
15/06/15 15:32:34 INFO mapred.MapTask: soft limit at 83886080
15/06/15 15:32:34 INFO mapred.MapTask: bufstart = 0; bufvoid = 104857600
15/06/15 15:32:34 INFO mapred.MapTask: kvstart = 26214396; length = 6553600
15/06/15 15:32:34 INFO mapred.MapTask: Map output collector class = org.apache.hadoop.mapred.MapTask$MapOutputBuffer
15/06/15 15:32:34 INFO mapred.LocalJobRunner:
15/06/15 15:32:34 INFO mapred.MapTask: Starting flush of map output
15/06/15 15:32:34 INFO mapred.MapTask: Spilling map output
15/06/15 15:32:34 INFO mapred.MapTask: bufstart = 0; bufend = 18; bufvoid = 104857600
15/06/15 15:32:34 INFO mapred.MapTask: kvstart = 26214396(104857584); kvend = 26214392(104857568); length = 5/6553600
15/06/15 15:32:34 INFO mapred.MapTask: Finished spill 0
15/06/15 15:32:34 INFO mapred.Task: Task:attempt_local1644526633_0001_m_000001_0 is done. And is in the process of committing
15/06/15 15:32:34 INFO mapred.LocalJobRunner: map
15/06/15 15:32:34 INFO mapred.Task: Task 'attempt_local1644526633_0001_m_000001_0' done.
15/06/15 15:32:34 INFO mapred.LocalJobRunner: Finishing task: attempt_local1644526633_0001_m_000001_0
15/06/15 15:32:34 INFO mapred.LocalJobRunner: map task executor complete.
'attempt_local1644526633_0001_r_000000_0' to hdfs://localhost:54310/user/hduser/QuasiMonteCarlo_1434353547302_418935020/out/_temporary/0/task_local1644526633_0001_r_000000
15/06/15 15:32:35 INFO mapred.LocalJobRunner: reduce > reduce
15/06/15 15:32:35 INFO mapred.Task: Task 'attempt_local1644526633_0001_r_000000_0' done.
15/06/15 15:32:35 INFO mapred.LocalJobRunner: Finishing task:
attempt_local1644526633_0001_r_000000_0
15/06/15 15:32:35 INFO mapred.LocalJobRunner: reduce task executor complete.
15/06/15 15:32:35 INFO mapreduce.Job: map 100% reduce 100%
15/06/15 15:32:35 INFO mapreduce.Job: Job job_local1644526633_0001 completed successfully
15/06/15 15:32:35 INFO mapreduce.Job: Counters: 35
	File System Counters
		FILE: Number of bytes read=822302
		FILE: Number of bytes written=1648559
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=590
		HDFS: Number of bytes written=923
		HDFS: Number of read operations=30
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=15
	Map-Reduce Framework
		Map input records=2
		Map output records=4
		Map output bytes=36
		Map output materialized bytes=56
		Input split bytes=296
		Combine input records=0
		Combine output records=0
		Reduce input groups=2
		Reduce shuffle bytes=56
		Reduce input records=4
		Reduce output records=0
		Spilled Records=8
		Shuffled Maps =2
		Failed Shuffles=0
		Merged Map outputs=2
		GC time elapsed (ms)=0
		Total committed heap usage (bytes)=854065152
	Shuffle Errors
		BAD_ID=0
		CONNECTION=0
		IO_ERROR=0
		WRONG_LENGTH=0
		WRONG_MAP=0
		WRONG_REDUCE=0
	File Input Format Counters
		Bytes Read=236
	File Output Format Counters
		Bytes Written=97
Job Finished in 3.39 seconds
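The pi job above distributes point sampling across map tasks (Hadoop's QuasiMonteCarlo example uses a low-discrepancy point sequence rather than plain pseudo-random draws). The underlying estimate, 4 × (points inside the quarter circle) / (total points), can be sketched locally in a single awk call; the sample count and seed below are arbitrary choices for illustration:

```shell
# Plain Monte Carlo estimate of pi: sample points in the unit square and
# count the fraction landing inside the quarter circle x^2 + y^2 <= 1.
awk 'BEGIN {
  srand(1); n = 100000; inside = 0
  for (i = 0; i < n; i++) {
    x = rand(); y = rand()
    if (x*x + y*y <= 1) inside++
  }
  printf "pi is approximately %.3f\n", 4 * inside / n   # near 3.14
}'
```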
org.apache.hadoop.mapred.MapTask$MapOutputBuffer
15/06/15 15:35:28 INFO mapred.LocalJobRunner:
15/06/15 15:35:28 INFO mapred.MapTask: Starting flush of map output
15/06/15 15:35:28 INFO mapred.MapTask: Spilling map output
15/06/15 15:35:28 INFO mapred.MapTask: bufstart = 0; bufend = 53; bufvoid = 104857600
15/06/15 15:35:28 INFO mapred.MapTask: kvstart = 26214396(104857584); kvend = 26214384(104857536); length = 13/6553600
15/06/15 15:35:28 INFO mapred.MapTask: Finished spill 0
15/06/15 15:35:28 INFO mapred.Task: Task:attempt_local1075455800_0001_m_000000_0 is done. And is in the process of committing
15/06/15 15:35:28 INFO mapred.LocalJobRunner: map
15/06/15 15:35:28 INFO mapred.Task: Task 'attempt_local1075455800_0001_m_000000_0' done.
15/06/15 15:35:28 INFO mapred.LocalJobRunner: Finishing task: attempt_local1075455800_0001_m_000000_0
15/06/15 15:35:28 INFO mapred.LocalJobRunner: map task executor complete.
15/06/15 15:35:28 INFO mapred.LocalJobRunner: Waiting for reduce tasks
15/06/15 15:35:28 INFO mapred.LocalJobRunner: Starting task: attempt_local1075455800_0001_r_000000_0
15/06/15 15:35:28 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
15/06/15 15:35:28 INFO mapred.Task: Using ResourceCalculatorProcessTree : [ ]
15/06/15 15:35:28 INFO mapred.ReduceTask: Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@7d27a2b6
15/06/15 15:35:28 INFO reduce.MergeManagerImpl: MergerManager: memoryLimit=333971456, maxSingleShuffleLimit=83492864, mergeThreshold=220421168, ioSortFactor=10, memToMemMergeOutputsThreshold=10
15/06/15 15:35:28 INFO reduce.EventFetcher: attempt_local1075455800_0001_r_000000_0 Thread started: EventFetcher for fetching Map Completion Events
15/06/15 15:35:28 INFO reduce.LocalFetcher: localfetcher#1 about to shuffle output of map attempt_local1075455800_0001_m_000000_0 decomp: 63 len: 67 to MEMORY
15/06/15 15:35:28 INFO reduce.InMemoryMapOutput: Read 63 bytes from map-output for attempt_local1075455800_0001_m_000000_0
15/06/15 15:35:28 INFO reduce.MergeManagerImpl: closeInMemoryFile -> map-output of size: 63, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->63
15/06/15 15:35:28 INFO reduce.EventFetcher: EventFetcher is interrupted.. Returning
15/06/15 15:35:28 INFO mapred.LocalJobRunner: 1 / 1 copied.
15/06/15 15:35:28 INFO reduce.MergeManagerImpl: finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
15/06/15 15:35:28 INFO mapred.Merger: Merging 1 sorted segments
15/06/15 15:35:28 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 53 bytes
15/06/15 15:35:28 INFO reduce.MergeManagerImpl: Merged 1 segments, 63 bytes to disk to satisfy reduce memory limit
15/06/15 15:35:28 INFO reduce.MergeManagerImpl: Merging 1 files, 67 bytes from disk
15/06/15 15:35:28 INFO reduce.MergeManagerImpl: Merging 0 segments, 0 bytes from memory into reduce
15/06/15 15:35:28 INFO mapred.Merger: Merging 1 sorted segments
15/06/15 15:35:28 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 53 bytes
15/06/15 15:35:28 INFO mapred.LocalJobRunner: 1 / 1 copied.
15/06/15 15:35:29 INFO Configuration.deprecation: mapred.skip.on is deprecated. Instead, use mapreduce.job.skiprecords
		WRONG_MAP=0
		WRONG_REDUCE=0
	File Input Format Counters
		Bytes Read=37
	File Output Format Counters
		Bytes Written=45
hduser@midarto-ThinkPad-Edge-E130:~/coba$ hdfs dfs -ls
15/06/15 15:35:41 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 2 items
drwxr-xr-x   - hduser supergroup          0 2015-06-15 15:34 coba
drwxr-xr-x   - hduser supergroup          0 2015-06-15 15:35 coba-out
hduser@midarto-ThinkPad-Edge-E130:~/coba$ hdfs dfs -ls coba-out
15/06/15 15:36:05 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 2 items
-rw-r--r--   1 hduser supergroup          0 2015-06-15 15:35 coba-out/_SUCCESS
-rw-r--r--   1 hduser supergroup         45 2015-06-15 15:35 coba-out/part-r-00000
hduser@midarto-ThinkPad-Edge-E130:~/coba$ hdfs dfs -cat coba-out/part-r-00000
15/06/15 15:36:34 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Elektro	1
Hasanuddin	1
Teknik	1
Universitas	1
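The four counts in part-r-00000 are classic WordCount output. For comparison, the same tally can be produced locally with coreutils; the input line below is an assumption reconstructed from the output words, since the transcript never shows the contents of the coba input file:

```shell
# Local word count: split into one word per line, sort, count duplicates,
# then print word<TAB>count like the reducer output above.
printf 'Universitas Hasanuddin Teknik Elektro\n' |
  tr -s ' ' '\n' |
  sort | uniq -c |
  awk '{print $2 "\t" $1}'
```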
hduser@midarto-ThinkPad-Edge-E130:~/coba$ hdfs dfs -ls^C
hduser@midarto-ThinkPad-Edge-E130:~/coba$