
Setting Up a Hadoop 2.5.0 Pseudo-Distributed Environment

This chapter walks through setting up a Hadoop 2.5.0 pseudo-distributed environment on Linux. Before building the environment itself, a few prerequisites must be taken care of: creating a user, installing the JDK, and disabling the firewall.

一、創(chuàng)建hadoop用戶

As root, create a hadoop user. To keep the lab environment easy to work with, grant the hadoop user sudo rights:

useradd hadoop # add the hadoop user
passwd hadoop # set its password
visudo # then append the line below
hadoop ALL=(root) NOPASSWD:ALL

II. Build the Hadoop pseudo-distributed environment

1、關(guān)閉Linux中的防火墻和selinux

Disable SELinux:

sudo vi /etc/sysconfig/selinux # open the SELinux config file
SELINUX=disabled # change the SELINUX value to disabled
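The same edit can be done non-interactively with sed. The sketch below demonstrates it on a local copy of the file; on the real system the target is /etc/sysconfig/selinux and the sed command needs sudo.

```shell
# Demonstrate the edit on a local stand-in for /etc/sysconfig/selinux
printf 'SELINUX=enforcing\nSELINUXTYPE=targeted\n' > selinux.conf
# Rewrite the SELINUX= line to disabled, whatever its previous value
sed -i 's/^SELINUX=.*/SELINUX=disabled/' selinux.conf
grep '^SELINUX=' selinux.conf
```

Note that disabling SELinux via the config file only takes full effect after a reboot; `sudo setenforce 0` switches it to permissive mode for the current session.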

關(guān)閉防火墻,代碼如下:

sudo service iptables status # check the firewall status
sudo service iptables stop # stop the firewall
sudo chkconfig iptables off # keep it from starting at boot

2. Install the JDK

First, check whether the system ships with a bundled JDK; if it does, remove it before installing your own:

rpm -qa | grep java # check for a bundled JDK
sudo rpm -e --nodeps java-1.6.0-openjdk-1.6.0.0-1.50.1.11.5.el6_3.x86_64 tzdata-java-2012j-1.el6.noarch java-1.7.0-openjdk-1.7.0.9-2.3.4.1.el6_3.x86_64 # remove the bundled JDK (package names are from the example system; substitute whatever the query above returns)

Then install the JDK:

step1. Extract the archive:

tar -zxf jdk-7u67-linux-x64.tar.gz -C /usr/local/

step2. Configure the environment variables and verify the install:

sudo vi /etc/profile # open the profile file and append:
##JAVA_HOME
export JAVA_HOME=/usr/local/jdk1.7.0_67
export PATH=$PATH:$JAVA_HOME/bin

# reload the file
source /etc/profile # run as root

# verify the configuration
java -version
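The two exported lines can be tried in a throwaway shell session before touching /etc/profile; the JDK path below is the one this guide installs to, so adjust it to your own layout.

```shell
# Point JAVA_HOME at the JDK install directory (the tutorial's path;
# an assumption about where you extracted the archive)
export JAVA_HOME=/usr/local/jdk1.7.0_67
# Append the JDK's bin directory so java/javac resolve from it
export PATH=$PATH:$JAVA_HOME/bin
# The last PATH entry is now the JDK bin directory
echo "$PATH" | tr ':' '\n' | tail -1
```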

3. Install Hadoop

step1: Extract the Hadoop archive

tar -zxvf /opt/software/hadoop-2.5.0.tar.gz -C /opt/software/

Tip: delete the doc directory under /opt/software/hadoop-2.5.0/share to save space.

step2: In the three files hadoop-env.sh, mapred-env.sh, and yarn-env.sh under etc/hadoop, set JAVA_HOME:

export JAVA_HOME=/usr/local/jdk1.7.0_67

step3: Edit core-site.xml

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License. See accompanying LICENSE file.
-->

<!-- Put site-specific property overrides in this file. -->

<configuration>
    <property>
        <name>name</name>
        <value>my-study-cluster</value>
    </property>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://bigdata01:8020</value>
    </property>
    <!-- Base directory for temporary files the Hadoop runtime generates -->
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/opt/software/hadoop-2.5.0/data/tmp</value>
    </property>
    <!-- Minutes deleted files stay in the trash (1440 = one day) -->
    <property>
        <name>fs.trash.interval</name>
        <value>1440</value>
    </property>
    <property>
        <name>hadoop.http.staticuser.user</name>
        <value>hadoop</value>
    </property>
    <property>
        <name>hadoop.proxyuser.hadoop.hosts</name>
        <value>bigdata01</value>
    </property>
    <property>
        <name>hadoop.proxyuser.hadoop.groups</name>
        <value>*</value>
    </property>
</configuration>
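Every *-site.xml edited in these steps must remain well-formed XML, or the daemons will refuse to start. A quick sanity check using python3's standard-library parser (demonstrated on a small sample file here; on the real machine, point it at etc/hadoop/core-site.xml and the other site files):

```shell
# Write a minimal sample config, then parse it; a malformed file would
# make the parse exit non-zero instead of printing OK.
cat > sample-site.xml <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://bigdata01:8020</value>
  </property>
</configuration>
EOF
python3 -c 'import sys, xml.etree.ElementTree as ET; ET.parse(sys.argv[1]); print("OK")' sample-site.xml
```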

step4: Edit hdfs-site.xml

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- (Apache license header as in core-site.xml, omitted here) -->

<!-- Put site-specific property overrides in this file. -->

<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
    <property>
        <name>dfs.permissions.enabled</name>
        <value>false</value>
    </property>
    <property>
        <name>dfs.namenode.name.dir</name>
        <value>/opt/software/hadoop-2.5.0/data/name</value>
    </property>
    <property>
        <name>dfs.datanode.data.dir</name>
        <value>/opt/software/hadoop-2.5.0/data/data</value>
    </property>
</configuration>

step5: Edit mapred-site.xml (on a fresh install this file does not exist yet; create it by copying etc/hadoop/mapred-site.xml.template)

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- (Apache license header as in core-site.xml, omitted here) -->

<!-- Put site-specific property overrides in this file. -->

<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
    <property>
        <name>mapreduce.jobhistory.address</name>
        <value>bigdata01:10020</value>
    </property>
    <property>
        <name>mapreduce.jobhistory.webapp.address</name>
        <value>bigdata01:19888</value>
    </property>
</configuration>

step6: Edit yarn-site.xml

<?xml version="1.0"?>
<!-- (Apache license header as in core-site.xml, omitted here) -->
<configuration>

<!-- Site specific YARN configuration properties -->

    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
    <property>
        <name>yarn.resourcemanager.hostname</name>
        <value>bigdata01</value>
    </property>
    <property>
        <name>yarn.log-aggregation-enable</name>
        <value>true</value>
    </property>
    <property>
        <name>yarn.log-aggregation.retain-seconds</name>
        <value>106800</value>
    </property>
    <property>
        <name>yarn.log.server.url</name>
        <value>http://bigdata01:19888/jobhistory/job/</value>
    </property>
</configuration>

step7: Edit the slaves file (it lists the hosts that run a DataNode and NodeManager; in pseudo-distributed mode that is just this one machine):

bigdata01

step8: Format the NameNode (run this once only; reformatting an existing cluster destroys its HDFS metadata)

bin/hdfs namenode -format

step9: Start the daemons

## Option 1: start each daemon individually
# NameNode
sbin/hadoop-daemon.sh start namenode
# DataNode
sbin/hadoop-daemon.sh start datanode
# ResourceManager
sbin/yarn-daemon.sh start resourcemanager
# NodeManager
sbin/yarn-daemon.sh start nodemanager
# SecondaryNameNode
sbin/hadoop-daemon.sh start secondarynamenode
# JobHistory server
sbin/mr-jobhistory-daemon.sh start historyserver

## Option 2: use the combined scripts
sbin/start-dfs.sh # starts namenode, datanode, secondarynamenode
sbin/start-yarn.sh # starts resourcemanager, nodemanager
sbin/mr-jobhistory-daemon.sh start historyserver # starts the JobHistory server
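Note: start-dfs.sh and start-yarn.sh connect to each listed host (here, only bigdata01) over SSH, so passwordless SSH for the hadoop user is usually set up first. That step is not covered above; a sketch follows, generating into a scratch directory for demonstration, whereas on the real machine the files belong in ~/.ssh.

```shell
# keydir stands in for ~/.ssh in this sketch
keydir=$(mktemp -d)
# Generate an RSA keypair with an empty passphrase, no prompts
ssh-keygen -q -t rsa -N '' -f "$keydir/id_rsa"
# Authorize the public key for login and lock down its permissions
cat "$keydir/id_rsa.pub" >> "$keydir/authorized_keys"
chmod 600 "$keydir/authorized_keys"
ls "$keydir"
```

Afterwards (with the files in ~/.ssh), `ssh bigdata01` should log in without a password prompt.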

step10: Verify the setup

1. Open the HDFS web UI in a browser, on its external port 50070:

  http://bigdata01:50070

2. Open the YARN web UI in a browser, on its external port 8088:

  http://bigdata01:8088

3.執(zhí)行Wordcount程序

  bin/yarn jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.5.0.jar wordcount input output

  注:輸入輸出目錄自定義
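To see what the job computes, the same per-word counting can be sketched with plain shell tools on a local file; the MapReduce job does exactly this at scale (map emits one count per word, reduce sums them):

```shell
# Local stand-in for wordcount: split input into words, count occurrences
printf 'hello world\nhello hadoop\n' > wc.in
tr -s ' \t' '\n' < wc.in | sort | uniq -c | awk '{print $2, $1}' > wc.out
cat wc.out
# hadoop 1
# hello 2
# world 1
```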

結(jié)束!

Those are the steps for building a Hadoop 2.5.0 pseudo-distributed environment. If anything here is wrong, please point it out. Thanks!
