Getting Started with Hadoop: HDFS (Single-Node) Configuration and Deployment (Part 1)
1. Configure SSH

Install the SSH server and client:

sudo apt-get install openssh-server openssh-client

Verify the installation:

ssh username@192.168.30.128

Enter username's password when prompted; if the login succeeds, SSH is working. (Do not change the SSH port here: Hadoop assumes the default port 22, and changing it will cause errors when starting Hadoop.)
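Hadoop's start/stop scripts also log in to this node over SSH, so passwordless login is usually set up at this point as well. The original article does not show this step; a minimal sketch, assuming a single node and the default key path:

# generate a passwordless RSA key pair (skip if ~/.ssh/id_rsa already exists)
ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa
# authorize the key for logins to this same machine
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
# should now open a shell without asking for a password
ssh localhost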
2. Install the JDK (OpenJDK is used here; as for why not the Oracle JDK, search Baidu or Google)
Install the JDK:

sudo apt-get install openjdk-7-jdk

(openjdk-7 is the newest version available at the time of writing.)

Configure the environment variable:

sudo vim ~/.bashrc

Add at the end of the file:

export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-i386

(A short verification sketch follows the core-site.xml listing below.)

3. Configure Hadoop

Modify the configuration files hadoop/conf/core-site.xml, hadoop/conf/hdfs-site.xml and hadoop/conf/mapred-site.xml.

sudo vim /usr/local/hadoop/conf/core-site.xml

Change it to the following content:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://192.168.30.128:9000</value>
  </property>
</configuration>
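As mentioned in section 2, a quick way to check that the JDK and JAVA_HOME are picked up before continuing (the expected path assumes the openjdk-7 package installed above; Hadoop 1.x also reads JAVA_HOME from conf/hadoop-env.sh, so the same value may need to be set there):

# reload the shell configuration and confirm the JDK is visible
source ~/.bashrc
echo $JAVA_HOME     # should print /usr/lib/jvm/java-7-openjdk-i386
java -version       # should report an OpenJDK 1.7.x runtime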
sudo vim /usr/local/hadoop/conf/hdfs-site.xml

Change it to the following content:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/username/hadoop_tmp</value><!-- this directory must be created -->
    <description>A base for other temporary directories.</description>
  </property>
  <property>
    <name>dfs.name.dir</name>
    <value>/tmp/hadoop/dfs/datalog1,/tmp/hadoop/dfs/datalog2</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>/tmp/hadoop/dfs/data1,/tmp/hadoop/dfs/data2</value>
  </property>
  <property>
    <name>dfs.replication</name>
    <value>2</value>
  </property>
</configuration>
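The directories referenced in hdfs-site.xml above do not exist by default; a minimal sketch for creating them, simply mirroring the values in the listing (replace username with the actual account):

# create the temporary and storage directories used by HDFS
mkdir -p /home/username/hadoop_tmp
mkdir -p /tmp/hadoop/dfs/datalog1 /tmp/hadoop/dfs/datalog2
mkdir -p /tmp/hadoop/dfs/data1 /tmp/hadoop/dfs/data2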
sudo vim /usr/local/hadoop/conf/mapred-site.xml

Change it to the following content:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>192.168.30.128:9001</value>
  </property>
</configuration>
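Before running wordcount in the next section, HDFS has to be formatted and the daemons started. The original article does not show this step; a sketch assuming the usual Hadoop 1.x layout under /usr/local/hadoop:

cd /usr/local/hadoop/bin
# format the NameNode once, before the first start
./hadoop namenode -format
# start the HDFS and MapReduce daemons
./start-all.sh
# jps should list NameNode, DataNode, SecondaryNameNode, JobTracker and TaskTracker
jps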
<code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain">四 运行wordcount
<code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain"><code class="bash plain">在hdfs中创建一个统计目录,输出目录不用创建,否则运行wordcount的时候报错。 ./hadoop fs -mkdir /input./hadoop fs -put myword.txt /input./hadoop jar /usr/local/hadoop/hadoop-examples-1.2.1.jar wordcount /input /output./hadoop fs -cat <strong>/output/part-r-00000</strong>