Hadoop and Ecosystem Component Operation & Maintenance Installation Scripts


GitHub repository

These scripts are meant to make learning easier by freeing you from repeatedly setting up environments.
They target the last Hadoop 2 release; edit the scripts yourself to use a different version.

Versions:

hadoop 2.9.2

hive 1.2.2 (using the MapReduce execution engine)

mysql 5.7.6

jdbc 5.1.48

CentOS

Switch to the root user
su root
(Optional) CentOS 7 init script; for other CentOS versions, change the version number in the mirror URL inside the script

Mainly switches yum to a faster mirror and installs basic tools


cd ~
curl -O https://cdn.jsdelivr.net/gh/shawn-ms/HadoopScript/centos/centos7_init.sh
chmod 766 centos7_init.sh
./centos7_init.sh
Install Hadoop
cd ~
curl -O https://cdn.jsdelivr.net/gh/shawn-ms/HadoopScript/centos/centos_hadoop_install.sh
chmod 766 centos_hadoop_install.sh
./centos_hadoop_install.sh
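
After the script finishes, a quick sanity check can be sketched like this (assuming the script exported HADOOP_HOME and PATH through /etc/profile; the guard only prints a hint if Hadoop is not on PATH yet):

```shell
# Pick up the environment the install script wrote, then check the install.
. /etc/profile 2>/dev/null || true
if command -v hadoop >/dev/null 2>&1; then
    hadoop version                   # should report 2.9.2
    start-dfs.sh && start-yarn.sh    # bring up HDFS and YARN
    jps                              # expect NameNode, DataNode, ResourceManager, NodeManager
    HADOOP_STATE=installed
else
    HADOOP_STATE=missing             # re-login or source /etc/profile first
fi
echo "hadoop: $HADOOP_STATE"
```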


[scode type="red"]Before running the Hadoop install script, make sure ssh localhost succeeds first[/scode]

[scode type="yellow"]Error 1: ssh: connect to host localhost port 22: Connection refused[/scode]

# generate a passphrase-less key pair and authorize it for password-free login to localhost
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 0600 ~/.ssh/authorized_keys

[button color="success" icon="" url="https://kontext.tech/column/hadoop/307/install-hadoop-320-on-windows-10-using-windows-subsystem-for-linux-wsl" type=""]Solution link[/button]
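
A "Connection refused" on port 22 usually just means sshd is not installed or not running. A sketch of the usual CentOS 7 fix (assumes root on a systemd host; the guard makes it a no-op elsewhere):

```shell
# Install and start the OpenSSH server, then enable it at boot.
if [ "$(id -u)" -eq 0 ] && command -v yum >/dev/null 2>&1; then
    yum install -y openssh-server
    systemctl enable --now sshd 2>/dev/null || true
    SSHD_STATE=started
else
    SSHD_STATE=skipped   # needs root on a yum-based system
fi
echo "sshd: $SSHD_STATE"
```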
[scode type="yellow"]Error 2[/scode]
Could not load host key: /etc/ssh/ssh_host_rsa_key
Could not load host key: /etc/ssh/ssh_host_ecdsa_key
Could not load host key: /etc/ssh/ssh_host_ed25519_key
[button color="success" icon="" url="http://blog.chinaunix.net/uid-26168435-id-5732463.html" type=""]Solution link[/button]
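
These warnings mean the default host keys under /etc/ssh were never generated. `ssh-keygen -A` creates any that are missing; a sketch (run as root; guarded here so it is a no-op without root):

```shell
# Regenerate missing default host keys (rsa, ecdsa, ed25519) and reload sshd.
if [ "$(id -u)" -eq 0 ] && [ -d /etc/ssh ] && command -v ssh-keygen >/dev/null 2>&1; then
    ssh-keygen -A
    systemctl restart sshd 2>/dev/null || true
    HOSTKEY_STATE=regenerated
else
    HOSTKEY_STATE=skipped
fi
echo "host keys: $HOSTKEY_STATE"
```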

Install MySQL
cd ~
curl -O https://cdn.jsdelivr.net/gh/shawn-ms/HadoopScript/centos/centos_mysql.sh
chmod 766 centos_mysql.sh
./centos_mysql.sh
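
Once the script finishes, a quick check that MySQL came up (a sketch; `-p` will prompt for whatever root password the script configured):

```shell
# Confirm the server is running and that the client can log in.
if command -v mysql >/dev/null 2>&1; then
    systemctl status mysqld --no-pager 2>/dev/null || true
    mysql -uroot -p -e 'SELECT VERSION();' || true   # should print 5.7.x
    MYSQL_STATE=present
else
    MYSQL_STATE=absent   # the install script has not run yet
fi
echo "mysql client: $MYSQL_STATE"
```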
Install Hive (install and start Hadoop and MySQL before installing Hive)
source /etc/profile
cd ~
curl -O https://cdn.jsdelivr.net/gh/shawn-ms/HadoopScript/centos/hive.sh
chmod 766 hive.sh
./hive.sh
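
A minimal smoke test after the Hive script completes (a sketch; assumes HDFS and MySQL are already running):

```shell
# Pick up HIVE_HOME/PATH written by the script, then query the metastore.
. /etc/profile 2>/dev/null || true
if command -v hive >/dev/null 2>&1; then
    hive -e 'SHOW DATABASES;' || true   # a healthy metastore lists at least "default"
    HIVE_STATE=present
else
    HIVE_STATE=absent
fi
echo "hive: $HIVE_STATE"
```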

Ubuntu

Switch to the root user
su root

Ubuntu disables root login by default; on first use, set a root password with passwd root

(Optional) Ubuntu init script

Mainly switches apt to a faster mirror and installs basic tools


cd ~
curl -O https://file.masheng.fun/shell/ubuntu/ubuntu_init.sh
chmod 766 ubuntu_init.sh
./ubuntu_init.sh
Install Hadoop
cd ~
curl -O https://cdn.jsdelivr.net/gh/shawn-ms/HadoopScript/ubuntu/ubuntu_hadoop_install.sh
chmod 766 ubuntu_hadoop_install.sh
./ubuntu_hadoop_install.sh


Before running the Hadoop install script, make sure ssh localhost succeeds first

Install MySQL

You can follow
[post cid="60" /]
Remote access must be enabled, and the password set to 123456
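
Enabling remote access and setting the password to 123456 can be sketched like this (MySQL 5.7 syntax; `-p` prompts for the current root password; guarded so it is a no-op without a mysql client):

```shell
# Set the root password to 123456 and allow root to connect from any host.
if command -v mysql >/dev/null 2>&1; then
    mysql -uroot -p <<'SQL' || true
ALTER USER 'root'@'localhost' IDENTIFIED BY '123456';
GRANT ALL PRIVILEGES ON *.* TO 'root'@'%' IDENTIFIED BY '123456' WITH GRANT OPTION;
FLUSH PRIVILEGES;
SQL
    REMOTE_STATE=configured
else
    REMOTE_STATE=skipped
fi
echo "remote access: $REMOTE_STATE"
```

On Ubuntu, remote connections usually also require changing `bind-address` from 127.0.0.1 to 0.0.0.0 in /etc/mysql/mysql.conf.d/mysqld.cnf (the path may vary by version) and restarting MySQL.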

Install Hive (install and start Hadoop and MySQL before installing Hive)
source /etc/profile
cd ~
curl -O https://cdn.jsdelivr.net/gh/shawn-ms/HadoopScript/ubuntu/hive.sh
chmod 766 hive.sh
./hive.sh

Notice: sma's blog | All rights reserved; violators will be held liable | Original content unless otherwise noted | This site is licensed under CC BY-NC-SA

Reposting: please credit the original link - Hadoop and Ecosystem Component Operation & Maintenance Installation Scripts


Be a gentle yet strong person