
PermissionDenied when PySpark calls a bash script through the pipe transformation

2017-07-19

When running PySpark on a CDH YARN cluster and invoking a bash script through the pipe transformation, you may hit the following error:


File "/usr/lib64/python2.7/subprocess.py", line 1234, in _execute_child
    raise child_exception
OSError: [Errno 13] Permission denied
    at org.apache.spark.api.python.PythonRunner$$anon$1.read(PythonRDD.scala:166)
    at org.apache.spark.api.python.PythonRunner$$anon$1.<init>(PythonRDD.scala:207)
    at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:125)
    at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:70)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)

Solution:
The first thing to suspect with this error is a missing execute permission.
Add execute permission to the bash script:
chmod +x xx.sh

Resubmit the Spark job. If the error persists, the script may also need to be readable or writable, so set permissions recursively on the directory src that contains the script:
chmod -R 777 src
(Note that 777 opens the directory to everyone; in production, prefer the tightest mode that still works, e.g. 755.)
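
The failure mode is easy to reproduce locally, outside of Spark. A minimal sketch (the /tmp/demo.sh path is illustrative, not from the original article):

```shell
# Create a minimal bash script and strip its execute bit, mimicking a
# script shipped to the cluster without the right permissions
cat > /tmp/demo.sh <<'EOF'
#!/bin/bash
echo "hello from pipe"
EOF
chmod -x /tmp/demo.sh

# Invoking it directly fails the same way Spark's pipe() does
/tmp/demo.sh 2>/dev/null && echo "ran" || echo "permission denied (errno 13)"

# After adding the execute bit, the script runs normally
chmod +x /tmp/demo.sh
/tmp/demo.sh
```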
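
The traceback points at subprocess.py because Spark's pipe() ultimately launches the command through Python's subprocess module on each executor. A minimal sketch of the same failure and fix from Python itself (the temp-file script is illustrative; only subprocess and os.chmod are standard-library calls):

```python
import os
import subprocess
import tempfile

# Create a throwaway bash script without the execute bit, mimicking how
# a shipped script can arrive on a YARN executor node.
fd, path = tempfile.mkstemp(suffix=".sh")
with os.fdopen(fd, "w") as f:
    f.write("#!/bin/bash\necho ok\n")
os.chmod(path, 0o644)  # rw-r--r--: readable but not executable

# Without the execute bit, the OS refuses to exec the file with
# EACCES, which subprocess surfaces as OSError [Errno 13].
err = None
try:
    subprocess.check_output([path])
except OSError as e:
    err = e.errno  # 13 == Permission denied

# The Python equivalent of `chmod +x` (here: mode 755)
os.chmod(path, 0o755)
out = subprocess.check_output([path]).decode().strip()
print("errno:", err, "output:", out)
os.remove(path)
```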
