apache spark - How can I change SparkContext.sparkUser() setting (in pyspark)?


I am new to Spark and pyspark.

Using pyspark, after some RDD processing, I tried to save the result to HDFS with the saveAsTextFile() function. I got a 'Permission denied' error because pyspark tries to write to HDFS as my local account, 'kjlee', which does not exist on the HDFS system.

I can check the Spark user name with SparkContext().sparkUser(), but I can't find how to change it.
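
For reference, a minimal sketch of what I am doing (the app name and the HDFS output path are just placeholders):

    from pyspark import SparkContext

    sc = SparkContext(appName="example")

    # Prints the user Spark acts as when writing to HDFS --
    # on my machine this is the local account 'kjlee'.
    print(sc.sparkUser())

    rdd = sc.parallelize(["a", "b", "c"])
    # Fails with 'Permission denied' because 'kjlee' does not exist on HDFS.
    rdd.saveAsTextFile("hdfs:///user/kjlee/output")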

How can I change the Spark user name?

There is an environment variable for this: HADOOP_USER_NAME. Use export HADOOP_USER_NAME=anyuser in the shell, or in pyspark you can use os.environ["HADOOP_USER_NAME"] = "anyuser".
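
A minimal pyspark sketch of this approach (the user name 'anyuser', the app name, and the output path are placeholders; the variable has to be set before the SparkContext is created so the launched JVM picks it up):

    import os

    # Set the HDFS user before the SparkContext (and its JVM) starts,
    # otherwise the local account name is already in effect.
    os.environ["HADOOP_USER_NAME"] = "anyuser"

    from pyspark import SparkContext

    sc = SparkContext(appName="example")
    print(sc.sparkUser())  # should now report 'anyuser' (when not using Kerberos)

    rdd = sc.parallelize(["a", "b", "c"])
    rdd.saveAsTextFile("hdfs:///user/anyuser/output")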

