Install the JDK
# cd /var/tmp
# sh jdk-6u22-linux-x64-rpm.bin
Unpacking...
Checksumming...
Extracting...
UnZipSFX 5.50 of 17 February 2002, by Info-ZIP (Zip-Bugs@lists.wku.edu).
inflating: jdk-6u22-linux-amd64.rpm
inflating: sun-javadb-common-10.5.3-0.2.i386.rpm
inflating: sun-javadb-core-10.5.3-0.2.i386.rpm
inflating: sun-javadb-client-10.5.3-0.2.i386.rpm
inflating: sun-javadb-demo-10.5.3-0.2.i386.rpm
inflating: sun-javadb-docs-10.5.3-0.2.i386.rpm
inflating: sun-javadb-javadoc-10.5.3-0.2.i386.rpm
Preparing...                ########################################### [100%]
1:jdk ########################################### [100%]
Unpacking JAR files...
rt.jar...
jsse.jar...
charsets.jar...
tools.jar...
localedata.jar...
plugin.jar...
javaws.jar...
deploy.jar...
Installing JavaDB
Preparing...                ########################################### [100%]
1:sun-javadb-common ########################################### [ 17%]
2:sun-javadb-core ########################################### [ 33%]
3:sun-javadb-client ########################################### [ 50%]
4:sun-javadb-demo ########################################### [ 67%]
5:sun-javadb-docs ########################################### [ 83%]
6:sun-javadb-javadoc ########################################### [100%]
Java(TM) SE Development Kit 6 successfully installed.
Product Registration is FREE and includes many benefits:
* Notification of new versions, patches, and updates
* Special offers on Sun products, services and training
* Access to early releases and documentation
Product and system data will be collected. If your configuration
supports a browser, the Sun Product Registration form for
the JDK will be presented. If you do not register, none of
this information will be saved. You may also register your
JDK later by opening the register.html file (located in
the JDK installation directory) in a browser.
For more information on what data Registration collects and
how it is managed and used, see:
http://java.sun.com/javase/registration/JDKRegistrationPrivacy.html
Press Enter to continue.....
Press Enter.
Done.
# rehash
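Note that `rehash` is a csh/tcsh builtin that rebuilds the shell's table of command locations so newly installed binaries are found. If you are using bash instead, the equivalent is:

```shell
# bash equivalent of csh's `rehash`: forget remembered command locations
# so the shell re-searches $PATH on the next lookup
hash -r
```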
Verify the installation
# java -version
java version "1.6.0_22"
Java(TM) SE Runtime Environment (build 1.6.0_22-b04)
Java HotSpot(TM) 64-Bit Server VM (build 17.1-b03, mixed mode)
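To check the installed version from a script rather than by eye, the version number can be extracted from the first line of `java -version`-style output. A minimal sketch, assuming the quoted format shown above:

```shell
# Parse a `java -version`-style first line such as: java version "1.6.0_22"
# (in practice: line=$(java -version 2>&1 | head -n 1))
line='java version "1.6.0_22"'
version=$(echo "$line" | sed 's/.*"\(.*\)".*/\1/')
echo "$version"
```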
Install Hadoop
Configure the Cloudera yum repository
# curl http://archive.cloudera.com/redhat/cdh/cloudera-cdh3.repo > /etc/yum.repos.d/cloudera-cdh3.repo
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   211  100   211    0     0    234      0 --:--:-- --:--:-- --:--:--     0
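The downloaded file is a standard yum `.repo` definition. Its exact contents vary by CDH release; the sketch below writes an illustrative repo file (the `name` and `baseurl` values are assumptions for illustration, not copied from the real cloudera-cdh3.repo):

```shell
# Illustrative yum repo definition -- field values are assumptions,
# not the actual contents of cloudera-cdh3.repo.
cat > /tmp/example-cdh3.repo <<'EOF'
[cloudera-cdh3]
name=Cloudera's Distribution for Hadoop, Version 3
baseurl=http://archive.cloudera.com/redhat/cdh/3/
gpgcheck=0
EOF
# A quick sanity check that the file has a baseurl line
grep '^baseurl' /tmp/example-cdh3.repo
```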
Update yum
# yum -y update yum
Loaded plugins: fastestmirror
Loading mirror speeds from cached hostfile
* addons: ftp.iij.ad.jp
* base: ftp.iij.ad.jp
* extras: ftp.iij.ad.jp
* updates: ftp.iij.ad.jp
addons | 951 B 00:00
base | 2.1 kB 00:00
cloudera-cdh3 | 951 B 00:00
cloudera-cdh3/primary | 19 kB 00:00
cloudera-cdh3 68/68
extras | 2.1 kB 00:00
updates | 1.9 kB 00:00
Setting up Update Process
No Packages marked for Update
Install the Hadoop packages
# yum -y install hadoop-0.20-conf-pseudo
Loaded plugins: fastestmirror
Loading mirror speeds from cached hostfile
* addons: ftp.iij.ad.jp
* base: ftp.iij.ad.jp
* extras: ftp.iij.ad.jp
* updates: ftp.iij.ad.jp
Setting up Install Process
Resolving Dependencies
--> Running transaction check
---> Package hadoop-0.20-conf-pseudo.noarch 0:0.20.2+737-1 set to be updated
--> Processing Dependency: hadoop-0.20-namenode = 0.20.2+737 for package: hadoop-0.20-conf-pseudo
--> Processing Dependency: hadoop-0.20-secondarynamenode = 0.20.2+737 for package: hadoop-0.20-conf-pseudo
--> Processing Dependency: hadoop-0.20-tasktracker = 0.20.2+737 for package: hadoop-0.20-conf-pseudo
--> Processing Dependency: hadoop-0.20 = 0.20.2+737 for package: hadoop-0.20-conf-pseudo
--> Processing Dependency: hadoop-0.20-jobtracker = 0.20.2+737 for package: hadoop-0.20-conf-pseudo
--> Processing Dependency: hadoop-0.20-datanode = 0.20.2+737 for package: hadoop-0.20-conf-pseudo
--> Running transaction check
---> Package hadoop-0.20.noarch 0:0.20.2+737-1 set to be updated
---> Package hadoop-0.20-datanode.noarch 0:0.20.2+737-1 set to be updated
---> Package hadoop-0.20-jobtracker.noarch 0:0.20.2+737-1 set to be updated
---> Package hadoop-0.20-namenode.noarch 0:0.20.2+737-1 set to be updated
---> Package hadoop-0.20-secondarynamenode.noarch 0:0.20.2+737-1 set to be updated
---> Package hadoop-0.20-tasktracker.noarch 0:0.20.2+737-1 set to be updated
--> Finished Dependency Resolution
Dependencies Resolved
============================================================================================================
 Package                        Arch      Version         Repository       Size
============================================================================================================
Installing:
 hadoop-0.20-conf-pseudo        noarch    0.20.2+737-1    cloudera-cdh3    11 k
Installing for dependencies:
 hadoop-0.20                    noarch    0.20.2+737-1    cloudera-cdh3    37 M
 hadoop-0.20-datanode           noarch    0.20.2+737-1    cloudera-cdh3   4.3 k
 hadoop-0.20-jobtracker         noarch    0.20.2+737-1    cloudera-cdh3   4.4 k
 hadoop-0.20-namenode           noarch    0.20.2+737-1    cloudera-cdh3   4.4 k
 hadoop-0.20-secondarynamenode  noarch    0.20.2+737-1    cloudera-cdh3   4.4 k
 hadoop-0.20-tasktracker        noarch    0.20.2+737-1    cloudera-cdh3   4.4 k

Transaction Summary
============================================================================================================
Install       7 Package(s)
Upgrade       0 Package(s)

Total download size: 37 M
Downloading Packages:
(1/7): hadoop-0.20-datanode-0.20.2+737-1.noarch.rpm | 4.3 kB 00:00
(2/7): hadoop-0.20-tasktracker-0.20.2+737-1.noarch.rpm | 4.4 kB 00:00
(3/7): hadoop-0.20-namenode-0.20.2+737-1.noarch.rpm | 4.4 kB 00:00
(4/7): hadoop-0.20-secondarynamenode-0.20.2+737-1.noarch.rpm | 4.4 kB 00:00
(5/7): hadoop-0.20-jobtracker-0.20.2+737-1.noarch.rpm | 4.4 kB 00:00
(6/7): hadoop-0.20-conf-pseudo-0.20.2+737-1.noarch.rpm | 11 kB 00:00
(7/7): hadoop-0.20-0.20.2+737-1.noarch.rpm | 37 MB 00:21
------------------------------------------------------------------------------------------------------------
Total 1.5 MB/s | 37 MB 00:24
Running rpm_check_debug
Running Transaction Test
Finished Transaction Test
Transaction Test Succeeded
Running Transaction
Installing : hadoop-0.20 1/7
Installing : hadoop-0.20-datanode 2/7
Installing : hadoop-0.20-namenode 3/7
Installing : hadoop-0.20-jobtracker 4/7
Installing : hadoop-0.20-secondarynamenode 5/7
Installing : hadoop-0.20-tasktracker 6/7
Installing : hadoop-0.20-conf-pseudo 7/7
Installed:
hadoop-0.20-conf-pseudo.noarch 0:0.20.2+737-1
Dependency Installed:
hadoop-0.20.noarch 0:0.20.2+737-1 hadoop-0.20-datanode.noarch 0:0.20.2+737-1
hadoop-0.20-jobtracker.noarch 0:0.20.2+737-1 hadoop-0.20-namenode.noarch 0:0.20.2+737-1
hadoop-0.20-secondarynamenode.noarch 0:0.20.2+737-1 hadoop-0.20-tasktracker.noarch 0:0.20.2+737-1
Complete!
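Before running a job, the pseudo-distributed daemons may still need to be started, depending on the package's post-install scripts. CDH ships one init script per daemon role; a loop like the following sketch covers all five (the `hadoop-0.20-<role>` service names are an assumption based on the package names above, and the `echo` keeps the sketch runnable without the packages installed -- drop it on a real machine):

```shell
# Start each Hadoop daemon role in turn (service names assumed to
# mirror the hadoop-0.20-* package names; echo makes this a dry run).
for role in namenode datanode secondarynamenode jobtracker tasktracker; do
  echo service hadoop-0.20-$role start
done
```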
# hadoop jar /usr/lib/hadoop/hadoop-*-examples.jar pi 2 100000
Number of Maps = 2
Samples per Map = 100000
Wrote input for Map #0
Wrote input for Map #1
Starting Job
10/11/14 16:40:24 INFO mapred.FileInputFormat: Total input paths to process : 2
10/11/14 16:40:25 INFO mapred.JobClient: Running job: job_201011031631_0001
10/11/14 16:40:26 INFO mapred.JobClient: map 0% reduce 0%
10/11/14 16:40:45 INFO mapred.JobClient: map 50% reduce 0%
10/11/14 16:40:49 INFO mapred.JobClient: map 100% reduce 0%
10/11/14 16:41:08 INFO mapred.JobClient: map 100% reduce 100%
10/11/14 16:41:11 INFO mapred.JobClient: Job complete: job_201011031631_0001
10/11/14 16:41:11 INFO mapred.JobClient: Counters: 23
10/11/14 16:41:11 INFO mapred.JobClient: Job Counters
10/11/14 16:41:11 INFO mapred.JobClient: Launched reduce tasks=1
10/11/14 16:41:11 INFO mapred.JobClient: SLOTS_MILLIS_MAPS=35030
10/11/14 16:41:11 INFO mapred.JobClient: Total time spent by all reduces waiting after reserving slots (ms)=0
10/11/14 16:41:11 INFO mapred.JobClient: Total time spent by all maps waiting after reserving slots (ms)=0
10/11/14 16:41:11 INFO mapred.JobClient: Launched map tasks=2
10/11/14 16:41:11 INFO mapred.JobClient: Data-local map tasks=2
10/11/14 16:41:11 INFO mapred.JobClient: SLOTS_MILLIS_REDUCES=20012
10/11/14 16:41:11 INFO mapred.JobClient: FileSystemCounters
10/11/14 16:41:11 INFO mapred.JobClient: FILE_BYTES_READ=50
10/11/14 16:41:11 INFO mapred.JobClient: HDFS_BYTES_READ=468
10/11/14 16:41:11 INFO mapred.JobClient: FILE_BYTES_WRITTEN=170
10/11/14 16:41:11 INFO mapred.JobClient: HDFS_BYTES_WRITTEN=215
10/11/14 16:41:11 INFO mapred.JobClient: Map-Reduce Framework
10/11/14 16:41:11 INFO mapred.JobClient: Reduce input groups=2
10/11/14 16:41:11 INFO mapred.JobClient: Combine output records=0
10/11/14 16:41:11 INFO mapred.JobClient: Map input records=2
10/11/14 16:41:11 INFO mapred.JobClient: Reduce shuffle bytes=56
10/11/14 16:41:11 INFO mapred.JobClient: Reduce output records=0
10/11/14 16:41:11 INFO mapred.JobClient: Spilled Records=8
10/11/14 16:41:11 INFO mapred.JobClient: Map output bytes=36
10/11/14 16:41:11 INFO mapred.JobClient: Map input bytes=48
10/11/14 16:41:11 INFO mapred.JobClient: Combine input records=0
10/11/14 16:41:11 INFO mapred.JobClient: Map output records=4
10/11/14 16:41:11 INFO mapred.JobClient: SPLIT_RAW_BYTES=232
10/11/14 16:41:11 INFO mapred.JobClient: Reduce input records=4
Job Finished in 46.664 seconds
Estimated value of Pi is 3.14118000000000000000
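For reference, the estimator itself is simple: points are thrown into the unit square, and pi ≈ 4 × (points inside the quarter circle / total points). The Hadoop example uses a quasi-random (Halton) sequence distributed across the map tasks, but the counting idea is the same; a minimal local sketch of it in awk:

```shell
# Monte Carlo estimate of pi: the fraction of random points in the unit
# square that fall inside the quarter circle, times 4. Uses plain rand()
# rather than the Halton sequence the Hadoop example uses.
awk 'BEGIN {
  srand(1); n = 100000; inside = 0
  for (i = 0; i < n; i++) {
    x = rand(); y = rand()
    if (x*x + y*y <= 1.0) inside++
  }
  printf "%.3f\n", 4 * inside / n
}'
```

With 100000 samples the result lands close to 3.14, though with more statistical noise than the quasi-random sequence gives the Hadoop job.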