Spark Configuration
Spark Properties
Name | Value |
---|---|
spark.app.id | application_1504276062372_628765 |
spark.app.name | SparkSQL_Expand_Visview |
spark.driver.appUIAddress | http://10.10.10.171:4050 |
spark.driver.cores | 2 |
spark.driver.extraClassPath | :/usr/lib/hadoop/lib/hadoop-lzo-0.4.20-SNAPSHOT.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/datacenter/panel/lidl/spark/spark-1.6.2-bin-hadoop2.3/lib/joda-time-2.9.4.jar:/datacenter/panel/lidl/spark/spark-1.6.2-bin-hadoop2.3/lib/nscala-time_2.10-2.14.0.jar |
spark.driver.extraJavaOptions | -XX:+UseG1GC -XX:+DisableExplicitGC -XX:-ResizePLAB -XX:+ParallelRefProcEnabled -XX:+UseCompressedOops |
spark.driver.host | 10.10.10.171 |
spark.driver.memory | 5G |
spark.driver.port | 16021 |
spark.eventLog.compress | true |
spark.eventLog.dir | hdfs://adhnamenode:8020/user/beginner/spark_hist |
spark.eventLog.enabled | true |
spark.executor.cores | 2 |
spark.executor.extraClassPath | :/usr/lib/hadoop/lib/hadoop-lzo-0.4.20-SNAPSHOT.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/datacenter/panel/lidl/spark/spark-1.6.2-bin-hadoop2.3/lib/joda-time-2.9.4.jar:/datacenter/panel/lidl/spark/spark-1.6.2-bin-hadoop2.3/lib/nscala-time_2.10-2.14.0.jar |
spark.executor.extraJavaOptions | -XX:+UseG1GC -XX:+DisableExplicitGC -XX:-ResizePLAB -XX:+ParallelRefProcEnabled -XX:+UseCompressedOops |
spark.executor.id | driver |
spark.executor.instances | 20 |
spark.executor.memory | 6G |
spark.history.fs.logDirectory | hdfs://adhnamenode:8020/user/beginner/spark_hist |
spark.io.compression.lz4.blockSize | 64k |
spark.jars | file:/datacenter/panel/visiable/spark-2.1.0-bin-hadoop2.6/track_expand_visview-assembly-1.0.jar |
spark.locality.wait | 2s |
spark.master | yarn |
spark.memory.fraction | 0.75 |
spark.network.timeout | 300s |
spark.org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.param.PROXY_HOSTS | mainRM.master.adh |
spark.org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.param.PROXY_URI_BASES | http://mainRM.master.adh:8088/proxy/application_1504276062372_628765 |
spark.reducer.maxSizeInFlight | 96m |
spark.scheduler.mode | FIFO |
spark.serializer | org.apache.spark.serializer.KryoSerializer |
spark.shuffle.file.buffer | 128k |
spark.speculation | true |
spark.speculation.quantile | 0.98 |
spark.streaming.backpressure.enabled | true |
spark.streaming.kafka.maxRetries | 10000 |
spark.streaming.unpersist | true |
spark.submit.deployMode | client |
spark.ui.filters | org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter |
spark.yarn.access.namenodes | hdfs://mainNN.master.adh:8020/,hdfs://backNN.master.adh:8020/ |
spark.yarn.containerLauncherMaxThreads | 50 |
spark.yarn.driver.memoryOverhead | 1024 |
spark.yarn.executor.memoryOverhead | 1024 |
spark.yarn.historyServer.address | 10.10.10.178 |
spark.yarn.max.executor.failures | 10000 |
spark.yarn.submit.waitAppCompletion | false |
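For reference, properties like the ones in the table are typically set in `conf/spark-defaults.conf` (or passed per-job with `--conf` on `spark-submit`). Below is a minimal sketch of such a file using a subset of the values from the table above; it is illustrative, not a complete reproduction of the configuration:

```
spark.master                     yarn
spark.submit.deployMode          client
spark.serializer                 org.apache.spark.serializer.KryoSerializer
spark.executor.instances         20
spark.executor.cores             2
spark.executor.memory            6G
spark.driver.memory              5G
spark.speculation                true
spark.eventLog.enabled           true
spark.eventLog.dir               hdfs://adhnamenode:8020/user/beginner/spark_hist
```

Equivalently, any single property can be overridden on the command line, e.g. `spark-submit --conf spark.executor.memory=6G ...`; command-line `--conf` values take precedence over `spark-defaults.conf`.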
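A quick back-of-the-envelope calculation of the YARN resources this configuration requests, using values copied from the table (`spark.executor.instances=20`, `spark.executor.memory=6G`, `spark.yarn.executor.memoryOverhead=1024` MB, `spark.driver.memory=5G`, `spark.yarn.driver.memoryOverhead=1024` MB, 2 cores per executor and driver). This is a sketch of the requested totals only; YARN rounds each container up to its minimum allocation increment, so actual usage may be slightly higher:

```python
# Values taken from the property table above (memoryOverhead converted from MB to GB).
executor_instances = 20
executor_memory_gb = 6       # spark.executor.memory = 6G
executor_overhead_gb = 1     # spark.yarn.executor.memoryOverhead = 1024 MB
driver_memory_gb = 5         # spark.driver.memory = 5G
driver_overhead_gb = 1       # spark.yarn.driver.memoryOverhead = 1024 MB
cores_per_executor = 2       # spark.executor.cores
driver_cores = 2             # spark.driver.cores

# Each YARN container = heap + off-heap overhead.
executor_total_gb = executor_instances * (executor_memory_gb + executor_overhead_gb)
driver_total_gb = driver_memory_gb + driver_overhead_gb
total_cores = executor_instances * cores_per_executor + driver_cores

print(executor_total_gb)  # 140
print(driver_total_gb)    # 6
print(total_cores)        # 42
```

So this job asks YARN for roughly 140 GB of executor memory plus a 6 GB driver (running on the client, since `spark.submit.deployMode=client`), and 42 cores in total.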