
2019-02-23
Spark job fails to run

The log output is below. What is this problem, and how can it be solved?

[I 18:40:48.389 NotebookApp] Adapting to protocol v5.1 for kernel d27b7c29-4bda-47b1-97e7-67faf52079c2
[Stage 0:> (0 + 0) / 2]19/02/20 18:41:27 WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
19/02/20 18:41:42 WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
19/02/20 18:41:57 WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources

Answer: The executors did not obtain sufficient resources when the job was initialized. Check the cluster UI to confirm that the workers are registered and have enough free memory and cores, then adjust the executor resource settings in spark-env.sh or via spark-submit's --executor-memory option so that the request fits what the cluster can actually provide.
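
A minimal sketch of the corresponding settings from a PySpark notebook session (assuming a standalone cluster; the app name and the 1g / 1-core / 2-core values are placeholder assumptions, not values from the post, and should not exceed what the registered workers advertise in the cluster UI):

from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("resource-check")
    # Request no more memory/cores per executor than a worker offers;
    # otherwise the scheduler waits indefinitely and keeps logging the
    # "Initial job has not accepted any resources" warning seen above.
    .config("spark.executor.memory", "1g")
    .config("spark.executor.cores", "1")
    # Cap the total cores this application may claim across the cluster.
    .config("spark.cores.max", "2")
    .getOrCreate()
)

# A trivial action to confirm that executors are now accepting tasks.
print(spark.sparkContext.parallelize(range(100), 2).sum())

The same limits can be passed on the command line, e.g. spark-submit --executor-memory 1g --total-executor-cores 2.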
