Integrating Spark SQL with CDH 6.3.2
创始人
2024-06-01 00:17:05

Background

CDH 6 does not ship a spark-sql command by default. For application developers this barely matters, since they rarely work through SQL statements directly; for data warehouse and analytics staff, however, Hive SQL is slow and Spark SQL is a much better fit. Cloudera is a little self-serving here: to push its own Impala engine it strips the Spark SQL tooling out of the Spark it bundles, so CDH simply does not come with spark-sql. If you need the Spark SQL module on CDH, you have to install Apache Spark yourself and integrate it into the cluster.

Version selection

So which Spark version should the standalone install use? After some research, spark-2.4.0-bin-hadoop2.7.tgz was chosen, for the following reasons:

  1. The Spark bundled with CDH 6.3.2 is version 2.4.0.
  2. Spark 2.4.0 pre-built packages go up to Hadoop 2.7.x (the latest stable pre-built profile), while CDH 6.3.2 ships Hadoop 3.0.0; fortunately this mismatch does not prevent Spark from working.
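
Before downloading, it can be worth confirming what CDH itself ships. A minimal check, assuming the CDH gateway binaries are already on the PATH:

#Confirm the bundled Spark and Hadoop versions on any CDH gateway node
[root@bj-zjk-001 ~]# spark-submit --version
[root@bj-zjk-001 ~]# hadoop version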

Standalone Spark deployment

#Download
# wget http://archive.apache.org/dist/spark/spark-2.4.0/spark-2.4.0-bin-hadoop2.7.tgz 
#Extract the downloaded tarball into a suitable directory on the CDH cluster server
[root@bj-zjk-001 ~]# tar -zxvf /home/spark-2.4.0-bin-hadoop2.7.tgz -C /opt/cloudera/parcels/CDH/lib
#Move the extracted directory and rename it to spark2
[root@bj-zjk-001 ~]# mv /opt/cloudera/parcels/CDH/lib/spark-2.4.0-bin-hadoop2.7/ /opt/cloudera/parcels/CDH/lib/spark2
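
An optional sanity check: after the rename, the spark2 directory should contain the usual bin, conf and jars subdirectories.

#Optional: verify the unpacked layout
[root@bj-zjk-001 ~]# ls /opt/cloudera/parcels/CDH/lib/spark2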

Modify the configuration

Modify the Spark configuration

Delete everything under spark2/conf, then copy the cluster's existing Spark conf files into that directory.

#Remove all configuration shipped with the standalone Spark
[root@bj-zjk-001 ~]# rm -rf /opt/cloudera/parcels/CDH/lib/spark2/conf/*
#Copy the original Spark configuration into the standalone Spark's conf directory
[root@bj-zjk-001 ~]# cp -r /etc/spark/conf/* /opt/cloudera/parcels/CDH/lib/spark2/conf/
#Rename spark-env.sh to spark-env
[root@bj-zjk-001 ~]# mv /opt/cloudera/parcels/CDH/lib/spark2/conf/spark-env.sh /opt/cloudera/parcels/CDH/lib/spark2/conf/spark-env

Notes:

  • The spark/conf used here was taken from a ResourceManager node; installing spark-sql with the conf from a gateway node may leave Spark unable to use YARN properly.
  • spark-env.sh is renamed so that spark-sql does not pick up the bundled Spark environment settings and instead goes through the Hive metastore.

Modify the Hive configuration

Copy hive-site.xml from a Hive gateway node into the spark2/conf directory; the file itself needs no changes.

[root@bj-zjk-001 ~]# cp /etc/hive/conf/hive-site.xml /opt/cloudera/parcels/CDH/lib/spark2/conf/
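
The property spark-sql relies on from this file is the metastore URI. An optional check that it is present (the thrift address shown below is the one that appears later in the startup log; yours may differ):

#Optional: confirm the metastore URI made it into the copied file
[root@bj-zjk-001 ~]# grep -A1 "hive.metastore.uris" /opt/cloudera/parcels/CDH/lib/spark2/conf/hive-site.xml
  <name>hive.metastore.uris</name>
  <value>thrift://bj-zjk-001:9083</value>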

Create the spark-sql command file

#Create the command file with the following content
[root@bj-zjk-001 ~]# vim /opt/cloudera/parcels/CDH/bin/spark-sql
#!/bin/bash  
# Reference: http://stackoverflow.com/questions/59895/can-a-bash-script-tell-what-directory-its-stored-in  
export HADOOP_CONF_DIR=/etc/hadoop/conf
export YARN_CONF_DIR=/etc/hadoop/conf
SOURCE="${BASH_SOURCE[0]}"  
BIN_DIR="$( dirname "$SOURCE" )"  
while [ -h "$SOURCE" ]
do
  SOURCE="$(readlink "$SOURCE")"
  [[ $SOURCE != /* ]] && SOURCE="$BIN_DIR/$SOURCE"
  BIN_DIR="$( cd -P "$( dirname "$SOURCE" )" && pwd )"
done
BIN_DIR="$( cd -P "$( dirname "$SOURCE" )" && pwd )"
LIB_DIR=$BIN_DIR/../lib
export HADOOP_HOME=$LIB_DIR/hadoop
# Autodetect JAVA_HOME if not defined
. $LIB_DIR/bigtop-utils/bigtop-detect-javahome
exec $LIB_DIR/spark2/bin/spark-submit --class org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver "$@"
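
Before distributing the script, a quick syntax check helps catch copy-paste errors; bash -n parses the file without executing it.

#Optional: syntax-check the wrapper without running it
[root@bj-zjk-001 ~]# bash -n /opt/cloudera/parcels/CDH/bin/spark-sql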

Distribute the Spark installation directory

Distribute it to the other servers in the Spark cluster.

#Distribute the installation directory
[root@bj-zjk-001 ~]# scp -r /opt/cloudera/parcels/CDH/lib/spark2 bj-zjk-002:/opt/cloudera/parcels/CDH/lib/
[root@bj-zjk-001 ~]# scp -r /opt/cloudera/parcels/CDH/lib/spark2 bj-zjk-003:/opt/cloudera/parcels/CDH/lib/
#Distribute the spark-sql command file
[root@bj-zjk-001 ~]# scp /opt/cloudera/parcels/CDH/bin/spark-sql bj-zjk-002:/opt/cloudera/parcels/CDH/bin/
[root@bj-zjk-001 ~]# scp /opt/cloudera/parcels/CDH/bin/spark-sql bj-zjk-003:/opt/cloudera/parcels/CDH/bin/
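
If there are more nodes, the same distribution can be written as a loop (the hostnames below are the ones used above; substitute your own):

#Equivalent loop form for additional nodes
[root@bj-zjk-001 ~]# for host in bj-zjk-002 bj-zjk-003; do scp -r /opt/cloudera/parcels/CDH/lib/spark2 ${host}:/opt/cloudera/parcels/CDH/lib/; scp /opt/cloudera/parcels/CDH/bin/spark-sql ${host}:/opt/cloudera/parcels/CDH/bin/; done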

Configure the command shortcut

Run the following command on every server where Spark2 is installed.

[root@bj-zjk-001 ~]# alternatives --install /usr/bin/spark-sql spark-sql /opt/cloudera/parcels/CDH/bin/spark-sql 1
[root@bj-zjk-002 ~]# alternatives --install /usr/bin/spark-sql spark-sql /opt/cloudera/parcels/CDH/bin/spark-sql 1
[root@bj-zjk-003 ~]# alternatives --install /usr/bin/spark-sql spark-sql /opt/cloudera/parcels/CDH/bin/spark-sql 1
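
The registration can be verified on each node with the --display subcommand:

#Optional: confirm the alternative points at the CDH bin directory
[root@bj-zjk-001 ~]# alternatives --display spark-sql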

Grant execute permission

Run the following command on every server where Spark2 is installed.

[root@bj-zjk-001 ~]# chmod a+x /opt/cloudera/parcels/CDH/bin/spark-sql
[root@bj-zjk-002 ~]# chmod a+x /opt/cloudera/parcels/CDH/bin/spark-sql
[root@bj-zjk-003 ~]# chmod a+x /opt/cloudera/parcels/CDH/bin/spark-sql
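
A final check that the command resolves through the alternatives link and carries the execute bit:

#Optional: confirm spark-sql is on the PATH and executable
[root@bj-zjk-001 ~]# which spark-sql
[root@bj-zjk-001 ~]# ls -l /opt/cloudera/parcels/CDH/bin/spark-sql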

Test spark-sql

On a cluster, pick any one of the servers and run the spark-sql command:

[root@bj-zjk-002 ~]# spark-sql --master yarn --deploy-mode client
23/03/09 17:55:50 WARN conf.HiveConf: HiveConf of name hive.vectorized.use.checked.expressions does not exist
23/03/09 17:55:50 WARN conf.HiveConf: HiveConf of name hive.strict.checks.no.partition.filter does not exist
23/03/09 17:55:50 WARN conf.HiveConf: HiveConf of name hive.vectorized.use.vector.serde.deserialize does not exist
23/03/09 17:55:50 WARN conf.HiveConf: HiveConf of name hive.strict.checks.orderby.no.limit does not exist
23/03/09 17:55:50 WARN conf.HiveConf: HiveConf of name hive.vectorized.adaptor.usage.mode does not exist
23/03/09 17:55:50 WARN conf.HiveConf: HiveConf of name hive.vectorized.use.vectorized.input.format does not exist
23/03/09 17:55:50 WARN conf.HiveConf: HiveConf of name hive.vectorized.input.format.excludes does not exist
23/03/09 17:55:50 WARN conf.HiveConf: HiveConf of name hive.strict.checks.bucketing does not exist
23/03/09 17:55:50 WARN conf.HiveConf: HiveConf of name hive.strict.checks.type.safety does not exist
23/03/09 17:55:50 WARN conf.HiveConf: HiveConf of name hive.strict.checks.cartesian.product does not exist
23/03/09 17:55:50 INFO hive.metastore: Trying to connect to metastore with URI thrift://bj-zjk-001:9083
23/03/09 17:55:50 INFO hive.metastore: Connected to metastore.
......
23/03/09 18:02:34 INFO hive.HiveUtils: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
23/03/09 18:02:34 INFO client.HiveClientImpl: Warehouse location for Hive client (version 1.2.2) is /user/hive/warehouse
23/03/09 18:02:34 INFO state.StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
Spark master: yarn, Application Id: application_1677122930420_0016
23/03/09 18:02:35 INFO thriftserver.SparkSQLCLIDriver: Spark master: yarn, Application Id: application_1677122930420_0016
spark-sql> show databases;
23/03/09 18:03:25 INFO codegen.CodeGenerator: Code generated in 243.829568 ms
coder_dim
coder_dwd
coder_dws
coder_ods
cs_ods
cstudy
default
ods_coder
test
Time taken: 1.077 seconds, Fetched 9 row(s)
23/03/09 18:03:25 INFO thriftserver.SparkSQLCLIDriver: Time taken: 1.077 seconds, Fetched 9 row(s)
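
spark-sql can also be run non-interactively, which is handy for scripted checks; the -e option takes a quoted query string (the query here is just an example):

#Non-interactive usage example
[root@bj-zjk-002 ~]# spark-sql --master yarn --deploy-mode client -e "show databases"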

At this point, Spark SQL has been successfully integrated into CDH 6.3.2.

Troubleshooting

spark-sql error 1

[root@bj-zjk-001 ~]# spark-sql
Exception in thread "main" org.apache.spark.SparkException: When running with master 'yarn' either HADOOP_CONF_DIR or YARN_CONF_DIR must be set in the environment.

Solution:

#For a one-off command-line session a temporary export is enough; to make it permanent, add it to the system-wide /etc/profile or the per-user ~/.bashrc
[root@bj-zjk-001 ~]# export HADOOP_CONF_DIR=/etc/hadoop/conf
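
To make the setting permanent system-wide, something like the following works (append to ~/.bashrc instead for a single user; YARN_CONF_DIR is set here too, matching the wrapper script):

#Persist the variables system-wide (example)
[root@bj-zjk-001 ~]# echo 'export HADOOP_CONF_DIR=/etc/hadoop/conf' >> /etc/profile
[root@bj-zjk-001 ~]# echo 'export YARN_CONF_DIR=/etc/hadoop/conf' >> /etc/profile
[root@bj-zjk-001 ~]# source /etc/profile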

spark-sql error 2

[root@bj-zjk-001 ~]# spark-sql --master yarn --deploy-mode client
23/03/09 17:09:30 WARN conf.HiveConf: HiveConf of name hive.vectorized.use.checked.expressions does not exist
23/03/09 17:09:30 WARN conf.HiveConf: HiveConf of name hive.strict.checks.no.partition.filter does not exist
23/03/09 17:09:30 WARN conf.HiveConf: HiveConf of name hive.vectorized.use.vector.serde.deserialize does not exist
23/03/09 17:09:30 WARN conf.HiveConf: HiveConf of name hive.strict.checks.orderby.no.limit does not exist
23/03/09 17:09:30 WARN conf.HiveConf: HiveConf of name hive.vectorized.adaptor.usage.mode does not exist
23/03/09 17:09:30 WARN conf.HiveConf: HiveConf of name hive.vectorized.use.vectorized.input.format does not exist
23/03/09 17:09:30 WARN conf.HiveConf: HiveConf of name hive.vectorized.input.format.excludes does not exist
23/03/09 17:09:30 WARN conf.HiveConf: HiveConf of name hive.strict.checks.bucketing does not exist
23/03/09 17:09:30 WARN conf.HiveConf: HiveConf of name hive.strict.checks.type.safety does not exist
23/03/09 17:09:30 WARN conf.HiveConf: HiveConf of name hive.strict.checks.cartesian.product does not exist
23/03/09 17:09:31 INFO hive.metastore: Trying to connect to metastore with URI thrift://bj-zjk-001:9083
23/03/09 17:09:31 INFO hive.metastore: Connected to metastore.
23/03/09 17:09:32 INFO session.SessionState: Created local directory: /tmp/189f69c8-8d5f-471e-b8b4-f5b797f46015_resources
23/03/09 17:09:32 INFO session.SessionState: Created HDFS directory: /tmp/hive/hive/189f69c8-8d5f-471e-b8b4-f5b797f46015
23/03/09 17:09:32 INFO session.SessionState: Created local directory: /tmp/root/189f69c8-8d5f-471e-b8b4-f5b797f46015
23/03/09 17:09:32 INFO session.SessionState: Created HDFS directory: /tmp/hive/hive/189f69c8-8d5f-471e-b8b4-f5b797f46015/_tmp_space.db
23/03/09 17:09:32 INFO spark.SparkContext: Running Spark version 2.4.0
23/03/09 17:09:32 INFO spark.SparkContext: Submitted application: SparkSQL::172.24.86.96
23/03/09 17:09:32 INFO spark.SecurityManager: Changing view acls to: root,hive
23/03/09 17:09:32 INFO spark.SecurityManager: Changing modify acls to: root,hive
23/03/09 17:09:32 INFO spark.SecurityManager: Changing view acls groups to:
23/03/09 17:09:32 INFO spark.SecurityManager: Changing modify acls groups to:
23/03/09 17:09:32 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root, hive); groups with view permissions: Set(); users  with modify permissions: Set(root, hive); groups with modify permissions: Set()
23/03/09 17:09:32 INFO util.Utils: Successfully started service 'sparkDriver' on port 37788.
23/03/09 17:09:32 INFO spark.SparkEnv: Registering MapOutputTracker
23/03/09 17:09:32 INFO spark.SparkEnv: Registering BlockManagerMaster
23/03/09 17:09:32 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
23/03/09 17:09:32 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
23/03/09 17:09:32 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-5a798be1-1e89-4048-a8bd-d9c74957a6b3
23/03/09 17:09:32 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
23/03/09 17:09:32 INFO spark.SparkEnv: Registering OutputCommitCoordinator
23/03/09 17:09:32 INFO util.log: Logging initialized @3807ms
23/03/09 17:09:32 INFO server.Server: jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
23/03/09 17:09:32 INFO server.Server: Started @3903ms
23/03/09 17:09:33 INFO server.AbstractConnector: Started ServerConnector@7b6860f9{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
23/03/09 17:09:33 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
23/03/09 17:09:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2f5ac102{/jobs,null,AVAILABLE,@Spark}
23/03/09 17:09:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@22a6e998{/jobs/json,null,AVAILABLE,@Spark}
23/03/09 17:09:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@55e42449{/jobs/job,null,AVAILABLE,@Spark}
23/03/09 17:09:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6dfa915a{/jobs/job/json,null,AVAILABLE,@Spark}
23/03/09 17:09:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@78054f54{/stages,null,AVAILABLE,@Spark}
23/03/09 17:09:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@cb7fa71{/stages/json,null,AVAILABLE,@Spark}
23/03/09 17:09:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3dffc764{/stages/stage,null,AVAILABLE,@Spark}
23/03/09 17:09:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@654c7d2d{/stages/stage/json,null,AVAILABLE,@Spark}
23/03/09 17:09:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@26cb5207{/stages/pool,null,AVAILABLE,@Spark}
23/03/09 17:09:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@15400fff{/stages/pool/json,null,AVAILABLE,@Spark}
23/03/09 17:09:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@18d910b3{/storage,null,AVAILABLE,@Spark}
23/03/09 17:09:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1e7ab390{/storage/json,null,AVAILABLE,@Spark}
23/03/09 17:09:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@625d9132{/storage/rdd,null,AVAILABLE,@Spark}
23/03/09 17:09:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@77774571{/storage/rdd/json,null,AVAILABLE,@Spark}
23/03/09 17:09:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@277b8fa4{/environment,null,AVAILABLE,@Spark}
23/03/09 17:09:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6cd64ee8{/environment/json,null,AVAILABLE,@Spark}
23/03/09 17:09:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@620c8641{/executors,null,AVAILABLE,@Spark}
23/03/09 17:09:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2f1d0bbc{/executors/json,null,AVAILABLE,@Spark}
23/03/09 17:09:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5460b754{/executors/threadDump,null,AVAILABLE,@Spark}
23/03/09 17:09:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@a9f023e{/executors/threadDump/json,null,AVAILABLE,@Spark}
23/03/09 17:09:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@c27a3a2{/static,null,AVAILABLE,@Spark}
23/03/09 17:09:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@25ad4f71{/,null,AVAILABLE,@Spark}
23/03/09 17:09:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@49faf066{/api,null,AVAILABLE,@Spark}
23/03/09 17:09:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1422ac7f{/jobs/job/kill,null,AVAILABLE,@Spark}
23/03/09 17:09:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5e519ad3{/stages/stage/kill,null,AVAILABLE,@Spark}
23/03/09 17:09:33 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://bj-zjk-001:4040
23/03/09 17:09:33 INFO util.Utils: Using initial executors = 0, max of spark.dynamicAllocation.initialExecutors, spark.dynamicAllocation.minExecutors and spark.executor.instances
23/03/09 17:09:33 INFO client.ConfiguredRMFailoverProxyProvider: Failing over to rm109
23/03/09 17:09:33 INFO yarn.Client: Requesting a new application from cluster with 2 NodeManagers
23/03/09 17:09:33 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (10515 MB per container)
23/03/09 17:09:33 INFO yarn.Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
23/03/09 17:09:33 INFO yarn.Client: Setting up container launch context for our AM
23/03/09 17:09:33 INFO yarn.Client: Setting up the launch environment for our AM container
23/03/09 17:09:33 INFO yarn.Client: Preparing resources for our AM container
23/03/09 17:09:33 INFO yarn.Client: Uploading resource file:/tmp/spark-3404bdd3-65ef-4694-a35a-c91d8369ecce/__spark_conf__784434873262919548.zip -> hdfs://qianfengns/user/hive/.sparkStaging/application_1677122930420_0009/__spark_conf__.zip
23/03/09 17:09:33 INFO spark.SecurityManager: Changing view acls to: root,hive
23/03/09 17:09:33 INFO spark.SecurityManager: Changing modify acls to: root,hive
23/03/09 17:09:33 INFO spark.SecurityManager: Changing view acls groups to:
23/03/09 17:09:33 INFO spark.SecurityManager: Changing modify acls groups to:
23/03/09 17:09:33 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root, hive); groups with view permissions: Set(); users  with modify permissions: Set(root, hive); groups with modify permissions: Set()
23/03/09 17:09:34 INFO yarn.Client: Submitting application application_1677122930420_0009 to ResourceManager
23/03/09 17:09:35 INFO impl.YarnClientImpl: Submitted application application_1677122930420_0009
23/03/09 17:09:35 INFO cluster.SchedulerExtensionServices: Starting Yarn extension services with app application_1677122930420_0009 and attemptId None
23/03/09 17:09:36 INFO yarn.Client: Application report for application_1677122930420_0009 (state: ACCEPTED)
23/03/09 17:09:36 INFO yarn.Client:
	 client token: N/A
	 diagnostics: AM container is launched, waiting for AM container to Register with RM
	 ApplicationMaster host: N/A
	 ApplicationMaster RPC port: -1
	 queue: root.users.hive
	 start time: 1678352974930
	 final status: UNDEFINED
	 tracking URL: http://bj-zjk-002:8088/proxy/application_1677122930420_0009/
	 user: hive
23/03/09 17:09:37 INFO yarn.Client: Application report for application_1677122930420_0009 (state: ACCEPTED)
23/03/09 17:09:37 INFO cluster.YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> bj-zjk-001,bj-zjk-002, PROXY_URI_BASES -> http://bj-zjk-001:8088/proxy/application_1677122930420_0009,http://bj-zjk-002:8088/proxy/application_1677122930420_0009, RM_HA_URLS -> bj-zjk-001:8088,bj-zjk-002:8088), /proxy/application_1677122930420_0009
23/03/09 17:09:37 ERROR server.TransportRequestHandler: Error while invoking RpcHandler#receive() on RPC id 5010315192928208562
java.lang.ClassNotFoundException: org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages$RetrieveDelegationTokens$at java.net.URLClassLoader.findClass(URLClassLoader.java:381)at java.lang.ClassLoader.loadClass(ClassLoader.java:424)at java.lang.ClassLoader.loadClass(ClassLoader.java:357)at java.lang.Class.forName0(Native Method)at java.lang.Class.forName(Class.java:348)at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1863)at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1746)at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2037)at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1568)at java.io.ObjectInputStream.readObject(ObjectInputStream.java:428)at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:108)at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:271)at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:320)at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:270)at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:269)at org.apache.spark.rpc.netty.RequestMessage$.apply(NettyRpcEnv.scala:611)at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:662)at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:647)at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:181)at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:103)at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at 
io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1359)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:935)at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:138)at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)at java.lang.Thread.run(Thread.java:748)
23/03/09 17:09:37 INFO ui.JettyUtils: Adding filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter to /jobs, /jobs/json, /jobs/job, /jobs/job/json, /stages, /stages/json, /stages/stage, /stages/stage/json, /stages/pool, /stages/pool/json, /storage, /storage/json, /storage/rdd, /storage/rdd/json, /environment, /environment/json, /executors, /executors/json, /executors/threadDump, /executors/threadDump/json, /static, /, /api, /jobs/job/kill, /stages/stage/kill.
23/03/09 17:09:38 INFO yarn.Client: Application report for application_1677122930420_0009 (state: ACCEPTED)
23/03/09 17:09:39 INFO yarn.Client: Application report for application_1677122930420_0009 (state: ACCEPTED)
23/03/09 17:09:40 INFO yarn.Client: Application report for application_1677122930420_0009 (state: ACCEPTED)
23/03/09 17:09:40 INFO cluster.YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> bj-zjk-001,bj-zjk-002, PROXY_URI_BASES -> http://bj-zjk-001:8088/proxy/application_1677122930420_0009,http://bj-zjk-002:8088/proxy/application_1677122930420_0009, RM_HA_URLS -> bj-zjk-001:8088,bj-zjk-002:8088), /proxy/application_1677122930420_0009
23/03/09 17:09:40 INFO ui.JettyUtils: Adding filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter to /jobs, /jobs/json, /jobs/job, /jobs/job/json, /stages, /stages/json, /stages/stage, /stages/stage/json, /stages/pool, /stages/pool/json, /storage, /storage/json, /storage/rdd, /storage/rdd/json, /environment, /environment/json, /executors, /executors/json, /executors/threadDump, /executors/threadDump/json, /static, /, /api, /jobs/job/kill, /stages/stage/kill.
23/03/09 17:09:40 ERROR server.TransportRequestHandler: Error while invoking RpcHandler#receive() on RPC id 8383681388809426531
java.lang.ClassNotFoundException: org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages$RetrieveDelegationTokens$at java.net.URLClassLoader.findClass(URLClassLoader.java:381)at java.lang.ClassLoader.loadClass(ClassLoader.java:424)at java.lang.ClassLoader.loadClass(ClassLoader.java:357)at java.lang.Class.forName0(Native Method)at java.lang.Class.forName(Class.java:348)at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1863)at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1746)at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2037)at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1568)at java.io.ObjectInputStream.readObject(ObjectInputStream.java:428)at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:108)at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:271)at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:320)at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:270)at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:269)at org.apache.spark.rpc.netty.RequestMessage$.apply(NettyRpcEnv.scala:611)at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:662)at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:647)at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:181)at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:103)at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at 
io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1359)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:935)at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:138)at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)at java.lang.Thread.run(Thread.java:748)
23/03/09 17:09:41 INFO yarn.Client: Application report for application_1677122930420_0009 (state: FINISHED)
23/03/09 17:09:41 INFO yarn.Client:client token: N/Adiagnostics: Uncaught exception: org.apache.spark.SparkException: Exception thrown in awaitResult:at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:226)at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:92)at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:76)at org.apache.spark.deploy.yarn.ApplicationMaster.createAllocator(ApplicationMaster.scala:393)at org.apache.spark.deploy.yarn.ApplicationMaster.runExecutorLauncher(ApplicationMaster.scala:497)at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:277)at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:805)at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:804)at java.security.AccessController.doPrivileged(Native Method)at javax.security.auth.Subject.doAs(Subject.java:422)at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:804)at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:836)at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages$RetrieveDelegationTokens$at java.net.URLClassLoader.findClass(URLClassLoader.java:381)at java.lang.ClassLoader.loadClass(ClassLoader.java:424)at java.lang.ClassLoader.loadClass(ClassLoader.java:357)at java.lang.Class.forName0(Native Method)at java.lang.Class.forName(Class.java:348)at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1863)at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1746)at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2037)at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1568)at java.io.ObjectInputStream.readObject(ObjectInputStream.java:428)at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:108)at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:271)at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:320)at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:270)at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:269)at org.apache.spark.rpc.netty.RequestMessage$.apply(NettyRpcEnv.scala:611)at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:662)at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:647)at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:181)at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:103)at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at 
io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1359)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:935)at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:138)at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)at java.lang.Thread.run(Thread.java:748)at org.apache.spark.network.client.TransportResponseHandler.handle(TransportResponseHandler.java:207)at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:139)at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:53)at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1359)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:935)at 
io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:138)at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)at java.lang.Thread.run(Thread.java:748)ApplicationMaster host: 172.24.86.96ApplicationMaster RPC port: -1queue: root.users.hivestart time: 1678352974930final status: FAILEDtracking URL: http://bj-zjk-002:8088/proxy/application_1677122930420_0009/user: hive
23/03/09 17:09:41 INFO yarn.Client: Deleted staging directory hdfs://qianfengns/user/hive/.sparkStaging/application_1677122930420_0009
23/03/09 17:09:41 ERROR cluster.YarnClientSchedulerBackend: The YARN application has already ended! It might have been killed or the Application Master may have failed to start. Check the YARN application logs for more details.
23/03/09 17:09:41 ERROR spark.SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Uncaught exception: org.apache.spark.SparkException: Exception thrown in awaitResult:at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:226)at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:92)at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:76)at org.apache.spark.deploy.yarn.ApplicationMaster.createAllocator(ApplicationMaster.scala:393)at org.apache.spark.deploy.yarn.ApplicationMaster.runExecutorLauncher(ApplicationMaster.scala:497)at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:277)at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:805)at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:804)at java.security.AccessController.doPrivileged(Native Method)at javax.security.auth.Subject.doAs(Subject.java:422)at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:804)at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:836)at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages$RetrieveDelegationTokens$at java.net.URLClassLoader.findClass(URLClassLoader.java:381)at java.lang.ClassLoader.loadClass(ClassLoader.java:424)at java.lang.ClassLoader.loadClass(ClassLoader.java:357)at java.lang.Class.forName0(Native Method)at java.lang.Class.forName(Class.java:348)at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1863)at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1746)at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2037)at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1568)at java.io.ObjectInputStream.readObject(ObjectInputStream.java:428)at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:108)at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:271)at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:320)at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:270)at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:269)at org.apache.spark.rpc.netty.RequestMessage$.apply(NettyRpcEnv.scala:611)at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:662)at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:647)at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:181)at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:103)at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at 
io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1359)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:935)at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:138)at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)at java.lang.Thread.run(Thread.java:748)at org.apache.spark.network.client.TransportResponseHandler.handle(TransportResponseHandler.java:207)at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:139)at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:53)at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1359)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:935)at 
io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:138)at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)at java.lang.Thread.run(Thread.java:748)at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:94)at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:63)at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:178)at org.apache.spark.SparkContext.(SparkContext.scala:501)at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)at scala.Option.getOrElse(Option.scala:121)at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:48)at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.(SparkSQLCLIDriver.scala:315)at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:166)at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)at java.lang.reflect.Method.invoke(Method.java:498)at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
23/03/09 17:09:41 INFO server.AbstractConnector: Stopped Spark@7b6860f9{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
23/03/09 17:09:41 INFO ui.SparkUI: Stopped Spark web UI at http://bj-zjk-001:4040
23/03/09 17:09:41 WARN cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Attempted to request executors before the AM has registered!
23/03/09 17:09:41 INFO cluster.YarnClientSchedulerBackend: Shutting down all executors
23/03/09 17:09:41 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Asking each executor to shut down
23/03/09 17:09:41 INFO cluster.SchedulerExtensionServices: Stopping SchedulerExtensionServices
(serviceOption=None,services=List(),started=false)
23/03/09 17:09:41 INFO cluster.YarnClientSchedulerBackend: Stopped
23/03/09 17:09:41 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
23/03/09 17:09:41 INFO memory.MemoryStore: MemoryStore cleared
23/03/09 17:09:41 INFO storage.BlockManager: BlockManager stopped
23/03/09 17:09:41 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
23/03/09 17:09:41 WARN metrics.MetricsSystem: Stopping a MetricsSystem that is not running
23/03/09 17:09:41 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
23/03/09 17:09:41 INFO spark.SparkContext: Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: Uncaught exception: org.apache.spark.SparkException: Exception thrown in awaitResult:at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:226)at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:92)at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:76)at org.apache.spark.deploy.yarn.ApplicationMaster.createAllocator(ApplicationMaster.scala:393)at org.apache.spark.deploy.yarn.ApplicationMaster.runExecutorLauncher(ApplicationMaster.scala:497)at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:277)at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:805)at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:804)at java.security.AccessController.doPrivileged(Native Method)at javax.security.auth.Subject.doAs(Subject.java:422)at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:804)at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:836)at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages$RetrieveDelegationTokens$
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1863)
    at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1746)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2037)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1568)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:428)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:108)
    at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:271)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
    at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:320)
    at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:270)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
    at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:269)
    at org.apache.spark.rpc.netty.RequestMessage$.apply(NettyRpcEnv.scala:611)
    at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:662)
    at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:647)
    at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:181)
    at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:103)
    at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
    at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1359)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:935)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:138)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
    at java.lang.Thread.run(Thread.java:748)
    at org.apache.spark.network.client.TransportResponseHandler.handle(TransportResponseHandler.java:207)
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:139)
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:53)
    at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
    at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1359)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:935)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:138)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
    at java.lang.Thread.run(Thread.java:748)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:94)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:63)
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:178)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:501)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:48)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.<init>(SparkSQLCLIDriver.scala:315)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:166)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
23/03/09 17:09:41 INFO util.ShutdownHookManager: Shutdown hook called
23/03/09 17:09:41 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-358a54bc-b195-40a8-9833-e1bf1cdadea1
23/03/09 17:09:41 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-3404bdd3-65ef-4694-a35a-c91d8369ecce
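
The ClassNotFoundException above is thrown while deserializing an RPC message (CoarseGrainedClusterMessages$RetrieveDelegationTokens), which strongly suggests that the driver and the YARN ApplicationMaster are running two different Spark builds: the driver uses the standalone Apache Spark 2.4.0 under spark2, while spark.yarn.jars still points the ApplicationMaster and executors at CDH's bundled Spark jars. A quick way to confirm the mismatch is to check which build actually ships the missing class. The snippet below is only a sketch; it assumes the unzip tool is available, and the lib/spark2 and lib/spark directories follow the parcel layout used in this post.

    # Check which spark-core jar contains the class named in the exception.
    # Adjust the two directories to match your own parcel layout if needed.
    CLS='org/apache/spark/scheduler/cluster/CoarseGrainedClusterMessages$RetrieveDelegationTokens$.class'
    for dir in /opt/cloudera/parcels/CDH/lib/spark2/jars /opt/cloudera/parcels/CDH/lib/spark/jars; do
      echo "== $dir =="
      for jar in "$dir"/spark-core_*.jar; do
        # unzip -l lists the archive contents without extracting anything
        unzip -l "$jar" 2>/dev/null | grep -qF "$CLS" && echo "found in $jar"
      done
    done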

Solution:

  1. If running against the cluster (YARN), modify the spark-defaults.conf file on every server as follows (a verification sketch appears at the end of this section):

    #Change the value of spark.yarn.jars to the spark2 jars directory; commenting out the last two lines is enough
    [root@bj-zjk-001 ~]# vim /opt/cloudera/parcels/CDH/lib/spark2/conf/spark-defaults.conf
    spark.authenticate=false
    spark.driver.log.dfsDir=/user/spark/driverLogs
    spark.driver.log.persistToDfs.enabled=true
    spark.dynamicAllocation.enabled=true
    spark.dynamicAllocation.executorIdleTimeout=60
    spark.dynamicAllocation.minExecutors=0
    spark.dynamicAllocation.schedulerBacklogTimeout=1
    spark.eventLog.enabled=true
    spark.io.encryption.enabled=false
    spark.network.crypto.enabled=false
    spark.serializer=org.apache.spark.serializer.KryoSerializer
    spark.shuffle.service.enabled=true
    spark.shuffle.service.port=7337
    spark.ui.enabled=true
    spark.ui.killEnabled=true
    spark.lineage.log.dir=/var/log/spark/lineage
    spark.lineage.enabled=true
    spark.master=yarn
    spark.submit.deployMode=client
    spark.eventLog.dir=hdfs://qianfengns/user/spark/applicationHistory
    spark.yarn.historyServer.address=http://bj-zjk-001:18088
    spark.yarn.jars=local:/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/spark2/jars/*,local:/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/spark/hive/*
    spark.driver.extraLibraryPath=/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/native
    spark.executor.extraLibraryPath=/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/native
    spark.yarn.am.extraLibraryPath=/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hadoop/lib/native
    spark.yarn.config.gatewayPath=/opt/cloudera/parcels
    spark.yarn.config.replacementPath={{HADOOP_COMMON_HOME}}/../../..
    spark.yarn.historyServer.allowTracking=true
    spark.yarn.appMasterEnv.MKL_NUM_THREADS=1
    spark.executorEnv.MKL_NUM_THREADS=1
    spark.yarn.appMasterEnv.OPENBLAS_NUM_THREADS=1
    spark.executorEnv.OPENBLAS_NUM_THREADS=1
    #spark.extraListeners=com.cloudera.spark.lineage.NavigatorAppListener
    #spark.sql.queryExecutionListeners=com.cloudera.spark.lineage.NavigatorQueryListener
    
  2. Distribute the spark-defaults.conf file to the other spark2 servers

    [root@bj-zjk-001 ~]# scp /opt/cloudera/parcels/CDH/lib/spark2/conf/spark-defaults.conf bj-zjk-002:/opt/cloudera/parcels/CDH/lib/spark2/conf/
    [root@bj-zjk-001 ~]# scp /opt/cloudera/parcels/CDH/lib/spark2/conf/spark-defaults.conf bj-zjk-003:/opt/cloudera/parcels/CDH/lib/spark2/conf/
    

    Note

    If running against the cluster, it is recommended to create a matching jars directory on HDFS, upload the jar packages from the local spark2 directory to HDFS, and then set the following in spark-defaults.conf: spark.yarn.jars=hdfs://qianfengns/user/spark2/jars/*,local:/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/spark/hive/* (a hedged sketch of this change follows the commands below).

    #Create the jars directory on HDFS
    [root@bj-zjk-001 ~]# hdfs dfs -mkdir -p  hdfs://qianfengns/user/spark2/jars
    #Upload the jar packages to HDFS
    [root@bj-zjk-001 ~]# hdfs dfs -put /opt/cloudera/parcels/CDH/lib/spark2/jars/* hdfs://qianfengns/user/spark2/jars/
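
    A minimal sketch of the spark.yarn.jars change described in the note: the hostnames, the qianfengns nameservice and the parcel path are this cluster's values and must be adapted to your environment. The file is edited once on bj-zjk-001 and then redistributed the same way as in step 2.

    #Rewrite spark.yarn.jars locally to point at the jars uploaded to HDFS
    NEW_JARS='spark.yarn.jars=hdfs://qianfengns/user/spark2/jars/*,local:/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/spark/hive/*'
    CONF=/opt/cloudera/parcels/CDH/lib/spark2/conf/spark-defaults.conf
    sed -i "s#^spark.yarn.jars=.*#${NEW_JARS}#" "$CONF"
    #Redistribute the updated file to the other spark2 servers
    scp "$CONF" bj-zjk-002:/opt/cloudera/parcels/CDH/lib/spark2/conf/
    scp "$CONF" bj-zjk-003:/opt/cloudera/parcels/CDH/lib/spark2/conf/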
    

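After spark-defaults.conf has been updated and redistributed, re-running spark-sql on any node should no longer hit the ClassNotFoundException. A minimal smoke test might look like the following; the table name t_spark_sql_smoke is just a placeholder, and since spark.master=yarn is already set in spark-defaults.conf no extra flags are needed (note that each invocation starts its own YARN application, so it takes a little while):

    #Run a trivial query against the Hive metastore
    spark-sql -e "show databases;"
    #Create, read and drop a throw-away table to exercise writes on YARN
    spark-sql -e "create table if not exists default.t_spark_sql_smoke(id int);"
    spark-sql -e "insert into default.t_spark_sql_smoke values (1);"
    spark-sql -e "select * from default.t_spark_sql_smoke;"
    spark-sql -e "drop table default.t_spark_sql_smoke;"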