apache spark - How can I change the SparkContext.sparkUser() setting (in pyspark)?


I am new to Spark and PySpark.

Using PySpark, after processing an RDD, I tried to save it to HDFS with the saveAsTextFile() function. I get a 'permission denied' error because PySpark tries to write to HDFS using my local account, 'kjlee', which does not exist on the HDFS system.

I can check the Spark user name with SparkContext().sparkUser(), but I can't find out how to change it.
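
Roughly what I am doing (the path and the RDD contents are just placeholders):

    from pyspark import SparkContext

    sc = SparkContext(appName="save-to-hdfs-example")
    print(sc.sparkUser())  # prints 'kjlee', my local account

    rdd = sc.parallelize(["a", "b", "c"])
    # fails with 'permission denied' because user 'kjlee' does not exist on HDFS
    rdd.saveAsTextFile("hdfs:///user/someuser/output")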

How can I change the Spark user name?

There is an environment variable for this: HADOOP_USER_NAME. You can set it with export HADOOP_USER_NAME=anyuser before launching PySpark, or inside PySpark with os.environ["HADOOP_USER_NAME"] = "anyuser" (before the SparkContext is created).
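
For example, a minimal sketch ('anyuser' and the HDFS path stand for whichever account and directory you actually want to write to); the variable has to be set before the SparkContext starts, otherwise the Hadoop client has already picked up your local user name:

    import os

    # Set the HDFS user before the SparkContext (and its JVM) is created.
    os.environ["HADOOP_USER_NAME"] = "anyuser"

    from pyspark import SparkContext

    sc = SparkContext(appName="write-as-anyuser")
    rdd = sc.parallelize(["a", "b", "c"])
    rdd.saveAsTextFile("hdfs:///user/anyuser/output")

Equivalently, from the shell before running pyspark or spark-submit:

    export HADOOP_USER_NAME=anyuser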

