Spark Cluster Installation

Last updated on November 22, 2024

🧙 Questions

Install a Spark cluster (3.1.1).

☄️ Ideas

In Spark on YARN mode, Spark needs to read the HDFS location, which is determined by the core-site.xml file in the Hadoop config directory.
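To confirm which HDFS the configuration points to, you can print fs.defaultFS from the Hadoop config (assuming the hdfs CLI is already on the PATH):

hdfs getconf -confKey fs.defaultFS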

vim /etc/profile

# === vim /etc/profile ===
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export HDFS_CONF_DIR=$HADOOP_HOME/etc/hadoop
export YARN_CONF_DIR=$HADOOP_HOME/etc/hadoop
# === vim /etc/profile ===
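After editing, reload the profile and spot-check that the variables resolve (assuming HADOOP_HOME was already exported by the Hadoop install):

source /etc/profile
echo $HADOOP_CONF_DIR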
Download the files
cd /tmp/
nohup wget https://archive.apache.org/dist/spark/spark-3.1.1/spark-3.1.1-bin-hadoop3.2.tgz >> download_spark.log 2>&1 &  
tail -f download_spark.log
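Optionally verify the tarball before distributing it; the archive publishes a .sha512 file next to the tarball, so you can compare the hash manually:

sha512sum /tmp/spark-3.1.1-bin-hadoop3.2.tgz
# compare against https://archive.apache.org/dist/spark/spark-3.1.1/spark-3.1.1-bin-hadoop3.2.tgz.sha512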

scp /tmp/spark-3.1.1-bin-hadoop3.2.tgz root@slave1:/tmp/
scp /tmp/spark-3.1.1-bin-hadoop3.2.tgz root@slave2:/tmp/
scp /tmp/spark-3.1.1-bin-hadoop3.2.tgz root@slave3:/tmp/
scp /tmp/spark-3.1.1-bin-hadoop3.2.tgz root@slave4:/tmp/
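The four copies can also be written as one loop over the same hosts:

for h in slave1 slave2 slave3 slave4; do
  scp /tmp/spark-3.1.1-bin-hadoop3.2.tgz root@$h:/tmp/
done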

tar -vzxf /tmp/spark-3.1.1-bin-hadoop3.2.tgz -C /opt
ln -s /opt/spark-3.1.1-bin-hadoop3.2 /opt/spark
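The extract-and-symlink step has to run on every node, not only master. Assuming passwordless SSH as root (set up below for start-all.sh anyway), one way:

for h in slave1 slave2 slave3 slave4; do
  ssh root@$h "tar -vzxf /tmp/spark-3.1.1-bin-hadoop3.2.tgz -C /opt && ln -s /opt/spark-3.1.1-bin-hadoop3.2 /opt/spark"
done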

vim /etc/profile
# === vim /etc/profile ===
export SPARK_HOME=/opt/spark 
export PATH=$PATH:$SPARK_HOME/bin  
# === vim /etc/profile ===
source /etc/profile
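Confirm the PATH change took effect:

spark-submit --version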

Configure properties

cp /opt/spark/conf/spark-env.sh.template /opt/spark/conf/spark-env.sh
vim /opt/spark/conf/spark-env.sh

# === vim /opt/spark/conf/spark-env.sh ===
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk
export HADOOP_CONF_DIR=/opt/hadoop/etc/hadoop
export YARN_CONF_DIR=/opt/hadoop/etc/hadoop
export SPARK_CONF_DIR=/opt/spark/conf/
export SPARK_EXECUTOR_CORES=1
export SPARK_EXECUTOR_MEMORY=1G
export SPARK_DRIVER_MEMORY=1G
# SPARK_MASTER_IP is deprecated since Spark 2.0; use SPARK_MASTER_HOST
export SPARK_MASTER_HOST=master
export SPARK_MASTER_WEBUI_PORT=8080
export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER -Dspark.deploy.zookeeper.url=master:2181,slave1:2181,slave2:2181,slave3:2181,slave4:2181 -Dspark.deploy.zookeeper.dir=/spark"
# === vim /opt/spark/conf/spark-env.sh ===
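Because spark.deploy.recoveryMode=ZOOKEEPER is set, master state is persisted in ZooKeeper, so a standby master on another node can take over if the active one dies. A sketch, assuming slave1 is chosen as the standby (unset or adjust SPARK_MASTER_HOST in slave1's spark-env.sh so the standby binds to its own hostname):

# on slave1, after the config has been distributed
bash /opt/spark/sbin/start-master.sh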
cp /opt/spark/conf/workers.template /opt/spark/conf/workers
vim /opt/spark/conf/workers

# === vim /opt/spark/conf/workers ===
master
slave1
slave2
slave3
slave4
# === vim /opt/spark/conf/workers ===
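start-all.sh SSHes into every host listed in conf/workers, so master needs passwordless SSH to all of them (including itself, since master also runs a worker here). If that is not set up yet:

ssh-keygen -t rsa   # run once on master, accept the defaults
for h in master slave1 slave2 slave3 slave4; do
  ssh-copy-id root@$h
done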

Distribute Spark

scp /opt/spark/conf/* root@slave1:/opt/spark/conf/
scp /opt/spark/conf/* root@slave2:/opt/spark/conf/
scp /opt/spark/conf/* root@slave3:/opt/spark/conf/
scp /opt/spark/conf/* root@slave4:/opt/spark/conf/
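A quick spot check that the config actually landed, here on slave1 as an example:

ssh root@slave1 "grep SPARK_MASTER /opt/spark/conf/spark-env.sh"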

Start Spark

# Run on master only; starts the master and every worker listed in conf/workers
bash /opt/spark/sbin/start-all.sh
# Stops the whole cluster
bash /opt/spark/sbin/stop-all.sh
# Check the logs
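Daemon logs land under /opt/spark/logs/; the file names encode the user, daemon class, and host, so the exact name below is illustrative and will vary with your environment:

ls /opt/spark/logs/
tail -f /opt/spark/logs/spark-root-org.apache.spark.deploy.master.Master-1-master.out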

Verify
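On every node, jps should list a Worker process, plus a Master on the master node, and the master web UI should be reachable at http://master:8080. As a smoke test, submit the bundled SparkPi example (the jar name matches the Spark 3.1.1 / Scala 2.12 build):

jps
spark-submit --class org.apache.spark.examples.SparkPi \
  --master spark://master:7077 \
  /opt/spark/examples/jars/spark-examples_2.12-3.1.1.jar 10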

