On Amazon EMR 4.0.0, setting /etc/spark/conf/spark-env.conf is ineffective


I'm launching a Spark-based HiveServer2 on Amazon EMR, and it has a classpath dependency. Due to a bug in Amazon EMR:

https://petz2000.wordpress.com/2015/08/18/get-blas-working-with-spark-on-amazon-emr/

my classpath cannot be submitted through the "--driver-class-path" option,

so I'm forced to modify /etc/spark/conf/spark-env.conf to add the classpath:

# Add Hadoop libraries to the Spark classpath
SPARK_CLASSPATH="${SPARK_CLASSPATH}:${HADOOP_HOME}/*:${HADOOP_HOME}/../hadoop-hdfs/*:${HADOOP_HOME}/../hadoop-mapreduce/*:${HADOOP_HOME}/../hadoop-yarn/*:/home/hadoop/git/datapassport/*"

where "/home/hadoop/git/datapassport/*" classpath.

However, after launching the server successfully, the Spark environment parameters show that the change is ineffective:

spark.driver.extraClassPath    :/usr/lib/hadoop/*:/usr/lib/hadoop/../hadoop-hdfs/*:/usr/lib/hadoop/../hadoop-mapreduce/*:/usr/lib/hadoop/../hadoop-yarn/*:/etc/hive/conf:/usr/lib/hadoop/../hadoop-lzo/lib/*:/usr/share/aws/emr/emrfs/conf:/usr/share/aws/emr/emrfs/lib/*:/usr/share/aws/emr/emrfs/auxlib/*
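For cross-checking, here is a minimal sketch of how the effective settings can be inspected on the master node (it assumes the standard EMR 4.x config locations; adjust the paths if yours differ). The same values also show up in the Environment tab of the Spark UI.

# Sketch: compare the shipped spark-defaults with what spark-env exports
grep -i extraClassPath /etc/spark/conf/spark-defaults.conf
grep -i SPARK_CLASSPATH /etc/spark/conf/spark-env.conf /etc/spark/conf/spark-env.sh 2>/dev/null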

Is this configuration file obsolete? Where is the new file, and how do I fix this problem?

Have you tried setting spark.driver.extraClassPath in spark-defaults? Something like this:

[
  {
    "classification": "spark-defaults",
    "properties": {
      "spark.driver.extraClassPath": "${SPARK_CLASSPATH}:${HADOOP_HOME}/*:${HADOOP_HOME}/../hadoop-hdfs/*:${HADOOP_HOME}/../hadoop-mapreduce/*:${HADOOP_HOME}/../hadoop-yarn/*:/home/hadoop/git/datapassport/*"
    }
  }
]
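For completeness, a rough sketch of how that classification could be supplied at cluster creation time with the AWS CLI (configurations.json is just an assumed file name holding the JSON above; the instance settings are placeholders):

# Sketch: apply the spark-defaults classification when creating the cluster
aws emr create-cluster \
  --release-label emr-4.0.0 \
  --applications Name=Spark Name=Hive \
  --configurations file://./configurations.json \
  --use-default-roles \
  --instance-type m3.xlarge \
  --instance-count 3

One caveat: the property value is written into spark-defaults.conf verbatim, and Spark does not shell-expand that file, so the ${SPARK_CLASSPATH}/${HADOOP_HOME} references may need to be replaced with literal paths (e.g. /usr/lib/hadoop, as shown in the output you posted).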
