apache spark - How can I change SparkContext.sparkUser() setting (in PySpark)?
I am new to Spark and PySpark.

Using PySpark, after processing an RDD I tried to save it to HDFS with the saveAsTextFile() function. I get a 'permission denied' error, because PySpark tries to write to HDFS using my local account, 'kjlee', which does not exist on the HDFS system.

I can check the Spark user name with SparkContext().sparkUser(), but I can't find how to change it.

How can I change the Spark user name?
There is an environment variable for this: HADOOP_USER_NAME. Either export it in the shell with `export HADOOP_USER_NAME=anyuser`, or set it from within PySpark with `os.environ["HADOOP_USER_NAME"] = "anyuser"`.
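A minimal sketch of the in-script approach. The key point is that the variable must be set before the SparkContext (and its JVM) is created, since the Hadoop client picks up HADOOP_USER_NAME at startup; 'anyuser' and the HDFS path are placeholders for your own values:

```python
import os

# Set the HDFS user BEFORE creating the SparkContext; the Hadoop
# client reads HADOOP_USER_NAME when it initializes.
os.environ["HADOOP_USER_NAME"] = "anyuser"  # placeholder user name

# The Spark calls themselves (shown as comments so the sketch stands alone):
# from pyspark import SparkContext
# sc = SparkContext(appName="example")
# print(sc.sparkUser())                       # should now report 'anyuser'
# rdd = sc.parallelize(["a", "b", "c"])
# rdd.saveAsTextFile("hdfs:///tmp/output")    # placeholder path
```

Equivalently, exporting the variable in the shell before launching `pyspark` or `spark-submit` has the same effect and avoids touching the script.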