
Setting Up a Hadoop 2.5.0 Pseudo-Distributed Environment

This article walks through setting up a Hadoop 2.5.0 pseudo-distributed environment on Linux. Before installing Hadoop itself, a few prerequisites must be in place: creating a dedicated user, installing the JDK, and disabling the firewall.

I. Creating the hadoop User

Create the hadoop user from the root account. To make the lab environment easier to work with, grant the hadoop user passwordless sudo:

useradd hadoop                   # add the hadoop user
passwd hadoop                    # set its password
visudo                           # then add the line below to the sudoers file:
hadoop ALL=(root) NOPASSWD:ALL
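To confirm the account and the sudo rule took effect, a quick check can be run as root (a sketch; output details vary by distribution):

```shell
# Verify the hadoop account exists and carries the expected sudo rule.
id hadoop                          # prints the user's uid, gid, and groups
sudo -l -U hadoop | grep NOPASSWD  # should show the NOPASSWD rule added via visudo
```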

II. Hadoop Pseudo-Distributed Environment Setup

1. Disable the Linux firewall and SELinux

Disable SELinux (the change takes effect after a reboot; to stop enforcement immediately in the current session, run sudo setenforce 0):

sudo vi /etc/sysconfig/selinux   # open the SELinux config file
SELINUX=disabled                 # set the SELINUX property to disabled

Stop the firewall:

sudo service iptables status     # check the firewall status
sudo service iptables stop       # stop the firewall
sudo chkconfig iptables off      # keep it from starting at boot

2. Install the JDK

First, check whether the system has a bundled OpenJDK installed; if it does, uninstall it first:

rpm -qa | grep java   # list any installed JDK packages
sudo rpm -e --nodeps java-1.6.0-openjdk-1.6.0.0-1.50.1.11.5.el6_3.x86_64 tzdata-java-2012j-1.el6.noarch java-1.7.0-openjdk-1.7.0.9-2.3.4.1.el6_3.x86_64   # remove the bundled JDK packages

Next, install the JDK:

Step 1: unpack the archive:

tar -zxf jdk-7u67-linux-x64.tar.gz -C /usr/local/

Step 2: configure the environment variables and verify the installation:

sudo vi /etc/profile   # open the profile file
##JAVA_HOME
export JAVA_HOME=/usr/local/jdk1.7.0_67
export PATH=$PATH:$JAVA_HOME/bin

# reload the file
source /etc/profile   # run as the root user

# verify the configuration
java -version

3. Install Hadoop

Step 1: unpack the Hadoop archive

tar -zxvf /opt/software/hadoop-2.5.0.tar.gz -C /opt/software/

Tip: delete the doc directory under /opt/software/hadoop-2.5.0/share; it is large and not needed here.

Step 2: set JAVA_HOME in the hadoop-env.sh, mapred-env.sh, and yarn-env.sh files under etc/hadoop:

export JAVA_HOME=/usr/local/jdk1.7.0_67
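Rather than editing the three files by hand, the same change can be scripted. A sketch, assuming the install paths used above (the pattern also matches a commented-out JAVA_HOME line, which is how some of these files ship):

```shell
# Point JAVA_HOME in the three env scripts at the JDK installed earlier.
HADOOP_CONF=/opt/software/hadoop-2.5.0/etc/hadoop
for f in hadoop-env.sh mapred-env.sh yarn-env.sh; do
  sed -i 's|^#\? *export JAVA_HOME=.*|export JAVA_HOME=/usr/local/jdk1.7.0_67|' "$HADOOP_CONF/$f"
done
grep '^export JAVA_HOME=' "$HADOOP_CONF"/hadoop-env.sh   # confirm the change
```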

Step 3: edit core-site.xml

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License. See accompanying LICENSE file.
-->

<!– Put site-specific property overrides in this file. –>

<configuration>
    <property>
        <name>name</name>
        <value>my-study-cluster</value>
    </property>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://bigdata01:8020</value>
    </property>
    <!-- Base temporary directory for files Hadoop generates -->
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/opt/software/hadoop-2.5.0/data/tmp</value>
    </property>
    <property>
        <name>fs.trash.interval</name>
        <value>1440</value>
    </property>
    <property>
        <name>hadoop.http.staticuser.user</name>
        <value>hadoop</value>
    </property>
    <property>
        <name>hadoop.proxyuser.hadoop.hosts</name>
        <value>bigdata01</value>
    </property>
    <property>
        <name>hadoop.proxyuser.hadoop.groups</name>
        <value>*</value>
    </property>
</configuration>

Step 4: edit hdfs-site.xml

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License. See accompanying LICENSE file.
-->

<!– Put site-specific property overrides in this file. –>

<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
    <property>
        <name>dfs.permissions.enabled</name>
        <value>false</value>
    </property>
    <property>
        <name>dfs.namenode.name.dir</name>
        <value>/opt/software/hadoop-2.5.0/data/name</value>
    </property>
    <property>
        <name>dfs.datanode.data.dir</name>
        <value>/opt/software/hadoop-2.5.0/data/data</value>
    </property>
</configuration>
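Hadoop normally creates the directories named in hadoop.tmp.dir, dfs.namenode.name.dir, and dfs.datanode.data.dir on format or startup, but creating them up front ensures the hadoop user owns them (paths as configured above):

```shell
# Pre-create the data directories referenced in core-site.xml and
# hdfs-site.xml, and give ownership to the hadoop user.
HADOOP_HOME=/opt/software/hadoop-2.5.0
mkdir -p "$HADOOP_HOME/data/tmp" "$HADOOP_HOME/data/name" "$HADOOP_HOME/data/data"
sudo chown -R hadoop:hadoop "$HADOOP_HOME/data"
```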

Step 5: edit mapred-site.xml (if only mapred-site.xml.template exists, copy it to mapred-site.xml first)

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License. See accompanying LICENSE file.
-->

<!– Put site-specific property overrides in this file. –>

<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
    <property>
        <name>mapreduce.jobhistory.address</name>
        <value>bigdata01:10020</value>
    </property>
    <property>
        <name>mapreduce.jobhistory.webapp.address</name>
        <value>bigdata01:19888</value>
    </property>
</configuration>

Step 6: edit yarn-site.xml

<?xml version="1.0"?>
<!--
  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License. See accompanying LICENSE file.
-->
<configuration>

<!– Site specific YARN configuration properties –>

    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
    <property>
        <name>yarn.resourcemanager.hostname</name>
        <value>bigdata01</value>
    </property>
    <property>
        <name>yarn.log-aggregation-enable</name>
        <value>true</value>
    </property>
    <property>
        <name>yarn.log-aggregation.retain-seconds</name>
        <value>106800</value>
    </property>
    <property>
        <name>yarn.log.server.url</name>
        <value>http://bigdata01:19888/jobhistory/job/</value>
    </property>
</configuration>

Step 7: edit the slaves file

bigdata01

Step 8: format the NameNode

bin/hdfs namenode -format

Step 9: start the daemons

## Option 1: start each daemon individually
# start the NameNode
sbin/hadoop-daemon.sh start namenode
# start the DataNode
sbin/hadoop-daemon.sh start datanode
# start the ResourceManager
sbin/yarn-daemon.sh start resourcemanager
# start the NodeManager
sbin/yarn-daemon.sh start nodemanager
# start the SecondaryNameNode
sbin/hadoop-daemon.sh start secondarynamenode
# start the job history server
sbin/mr-jobhistory-daemon.sh start historyserver

## Option 2: use the batch scripts
sbin/start-dfs.sh    # starts the NameNode, DataNode, and SecondaryNameNode
sbin/start-yarn.sh   # starts the ResourceManager and NodeManager
sbin/mr-jobhistory-daemon.sh start historyserver   # starts the job history server
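Once everything is up, jps (shipped with the JDK) should list six Hadoop processes. A small check script, as a sketch (the daemon names are the standard ones for this layout):

```shell
# Check that every expected daemon shows up in jps output.
expected="NameNode DataNode SecondaryNameNode ResourceManager NodeManager JobHistoryServer"
running=$(jps)
for d in $expected; do
  if echo "$running" | grep -qw "$d"; then
    echo "OK:      $d"
  else
    echo "MISSING: $d"
  fi
done
```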

Step 10: verify the setup

1. Open the HDFS web UI in a browser, on its external port 50070:

  http://bigdata01:50070

2. Open the YARN web UI in a browser, on its external port 8088:

  http://bigdata01:8088

3. Run the WordCount example:

  bin/yarn jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.5.0.jar wordcount input output

  Note: choose your own input and output directories.
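An end-to-end run might look like the following (a sketch; the file names are examples, and input/output are HDFS paths relative to the hadoop user's home directory):

```shell
# Prepare a small local input file, copy it into HDFS, run WordCount,
# and print the result. Run from the Hadoop install directory.
echo "hello hadoop hello world" > /tmp/words.txt
bin/hdfs dfs -mkdir -p input
bin/hdfs dfs -put /tmp/words.txt input/
bin/yarn jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.5.0.jar wordcount input output
bin/hdfs dfs -cat output/part-r-00000
```

If the output directory already exists, the job will fail; remove it first with bin/hdfs dfs -rm -r output.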

Done!

That completes the Hadoop 2.5.0 pseudo-distributed environment setup. If you spot any problems, please point them out. Thanks!
