Package preparation

JDK 1.7.x installation package (64-bit)
Hadoop 2.6.0 installation package (64-bit)

Environment preparation
1. Set the IP address (edit the network interface configuration first; see the sketch after this step)
Run: service network restart    Verify: ifconfig
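For reference, on CentOS 6 a static IP is typically set in /etc/sysconfig/network-scripts/ifcfg-eth0 before restarting the network. The interface name and all addresses below are placeholder assumptions; substitute your own:

# /etc/sysconfig/network-scripts/ifcfg-eth0 (assumed interface and addresses)
DEVICE=eth0
BOOTPROTO=static
ONBOOT=yes
IPADDR=192.168.1.100      # placeholder IP
NETMASK=255.255.255.0
GATEWAY=192.168.1.1       # placeholder gateway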
2. Turn off the firewall
Run: service iptables stop    Verify: service iptables status
3. Keep the firewall from starting automatically on boot
Run: chkconfig iptables off    Verify: chkconfig --list | grep iptables
4. Set the hostname
Run: vi /etc/sysconfig/network
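For example, to name the machine hadoop (the hostname here is an assumption; the change takes effect on reboot, or immediately with the hostname command):

# /etc/sysconfig/network
NETWORKING=yes
HOSTNAME=hadoop           # assumed hostname

# apply immediately without rebooting
hostname hadoop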
5. Bind the IP address to the hostname
Run: vi /etc/hosts
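Add a line mapping the machine's IP to its hostname; the values below are the same placeholders used above:

# /etc/hosts (placeholder values)
192.168.1.100   hadoop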
6. Set up passwordless SSH login
Run: (1) ssh-keygen -t rsa    (2) cp ~/.ssh/id_rsa.pub ~/.ssh/authorized_keys
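A quick check of the key setup (the chmod is a precaution: sshd commonly rejects an authorized_keys file with loose permissions):

chmod 600 ~/.ssh/authorized_keys   # sshd may refuse group/world-writable key files
ssh localhost                      # should log in without asking for a password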
Install the JDK
1. Run: cd /usr/local
chmod u+x jdk-7u79-linux-x64.tar.gz    # optional; a tar archive does not need the execute bit
tar -zxvf jdk-7u79-linux-x64.tar.gz    # extract the JDK
mv jdk1.7.0_79 jdk                     # rename jdk1.7.0_79 to jdk
2. Configure the JDK environment variables: vi /etc/profile and append:
export JAVA_HOME=/usr/local/jdk
export PATH=.:$JAVA_HOME/bin:$PATH
3. Run: source /etc/profile
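To confirm the new PATH is in effect (the version string shown is what JDK 7u79 prints):

java -version    # should report: java version "1.7.0_79"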
Install Hadoop 2.6.0
1. Extract the Hadoop 2.6.0 package under /usr/local
2. Rename the extracted folder to hadoop
3. Edit /etc/profile and append:
export HADOOP_HOME=/usr/local/hadoop
export PATH=.:$HADOOP_HOME/bin:$PATH
Run: source /etc/profile
Then run cd /usr/local/hadoop/etc/hadoop and edit the configuration files there.
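A quick check that the Hadoop environment variables took effect:

hadoop version    # should report: Hadoop 2.6.0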
Edit the configuration files
# hadoop-env.sh
export JAVA_HOME=/usr/local/jdk

# core-site.xml
<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/usr/hadoop/tmp</value>
  </property>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

# hdfs-site.xml
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:/usr/hadoop/dfs/name</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:/usr/hadoop/dfs/data</value>
  </property>
  <property>
    <name>dfs.permissions</name>
    <value>false</value>
  </property>
</configuration>

# mapred-site.xml (mapreduce.framework.name belongs here, not in yarn-site.xml;
# create the file first with: cp mapred-site.xml.template mapred-site.xml)
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>

# yarn-site.xml
<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
</configuration>
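Optionally, create the directories referenced in the configs above before starting anything; the format step creates the NameNode directory itself, so this is only a precaution against permission or missing-path errors:

mkdir -p /usr/hadoop/tmp /usr/hadoop/dfs/name /usr/hadoop/dfs/data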
Initialize (format) the NameNode (run from the /usr/local/hadoop directory)
Run: bin/hadoop namenode -format    (or the non-deprecated form: bin/hdfs namenode -format)
Start the NameNode: sbin/hadoop-daemon.sh start namenode
Start the DataNode: sbin/hadoop-daemon.sh start datanode
Start YARN: sbin/start-yarn.sh
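To confirm everything came up, jps should list the daemons, and the web UIs answer on the Hadoop 2.x default ports:

jps
# expected: NameNode, DataNode, ResourceManager, NodeManager (plus Jps itself)

# web interfaces (default ports in Hadoop 2.6)
# NameNode:        http://localhost:50070
# ResourceManager: http://localhost:8088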
Alternatively, start everything at once from the sbin directory: start-all.sh
Stop all Hadoop processes: stop-all.sh
(These two scripts are deprecated in Hadoop 2.x in favor of start-dfs.sh and start-yarn.sh, but they still work.)
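As a final sanity check, a small HDFS round trip plus the bundled example job exercises both HDFS and YARN (run from /usr/local/hadoop; the jar path is as shipped in the 2.6.0 distribution):

# HDFS round trip
bin/hdfs dfs -mkdir -p /user/test
bin/hdfs dfs -ls /

# run the bundled pi estimator as a YARN job
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar pi 2 5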
